Merge lp:~openerp-community/server-env-tools/7.0-modules-from-openobject-extension into lp:~server-env-tools-core-editors/server-env-tools/7.0

Status: Superseded
Proposed branch: lp:~openerp-community/server-env-tools/7.0-modules-from-openobject-extension
Merge into: lp:~server-env-tools-core-editors/server-env-tools/7.0
Diff against target: 841 lines (+764/-0)
13 files modified
base_external_dbsource/__init__.py (+24/-0)
base_external_dbsource/__openerp__.py (+61/-0)
base_external_dbsource/base_external_dbsource.py (+175/-0)
base_external_dbsource/base_external_dbsource_demo.xml (+15/-0)
base_external_dbsource/base_external_dbsource_view.xml (+54/-0)
base_external_dbsource/security/ir.model.access.csv (+2/-0)
base_external_dbsource/test/dbsource_connect.yml (+5/-0)
import_odbc/__init__.py (+24/-0)
import_odbc/__openerp__.py (+83/-0)
import_odbc/import_odbc.py (+218/-0)
import_odbc/import_odbc_demo.xml (+15/-0)
import_odbc/import_odbc_view.xml (+86/-0)
import_odbc/security/ir.model.access.csv (+2/-0)
To merge this branch: bzr merge lp:~openerp-community/server-env-tools/7.0-modules-from-openobject-extension
Reviewer Review Type Date Requested Status
Yannick Vaucher @ Camptocamp code review, no tests Needs Fixing
Nicolas Bessi - Camptocamp (community) code review, no tests Needs Fixing
Review via email: mp+183539@code.launchpad.net

This proposal has been superseded by a proposal from 2014-01-07.

Description of the change

[ADD] import_odbc and base_external_dbsource from lp:openobject-extension/7.0

Revision history for this message
Nicolas Bessi - Camptocamp (nbessi-c2c-deactivatedaccount) wrote :

Hello,

Thanks for the review

Some comments below:

Please use explicit relative imports: from . import

in __openerp__, please remove the 'init' key; it is unused

in base_external_dbsource.py:

Just a detail, but the import order is not standard:
- standard library imports at the top,
- then external/third-party modules,
- finally locally imported modules

please use the OpenERP/community convention for the model:
from openerp.osv import orm, fields

class base_external_dbsource(orm.Model):

line 71: the backslash is not needed
line 82: the backslash and '+' are not needed

in conn_open
            if '%s' not in data.conn_string:
                connStr += ';PWD=%s'
I'm not sure this is a judicious choice.
What if the password is set in the FreeTDS config files?
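One defensive sketch (a hypothetical helper, not code from the patch): only substitute the password when the connection string actually asks for it, so credentials configured in freetds.conf or odbc.ini are left untouched:

```python
def build_conn_string(conn_string, password):
    """Inject the password only when the template contains a '%s'
    placeholder; otherwise assume credentials come from elsewhere
    (e.g. the FreeTDS / ODBC configuration files)."""
    if password and '%s' in conn_string:
        return conn_string % password
    return conn_string
```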

In the connection_test function you raise an exception to say that the connection works.
It would be better to return an ir.actions.act_window.
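For illustration, the success path could return an action dictionary instead of raising; the field values below are assumptions for the sketch, not the module's actual code:

```python
def connection_test_success(source_name):
    """Sketch of an ir.actions.act_window-style dict that the web client
    can open as a popup to report a successful connection test."""
    return {
        'type': 'ir.actions.act_window',
        'res_model': 'base.external.dbsource',
        'view_type': 'form',
        'view_mode': 'form',
        'target': 'new',
        'name': 'Connection test for "%s" succeeded' % source_name,
    }
```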

A quick pass with a linter to fix PEP8 issues would be nice.

In the demo data used for the YAML test:

        <record model="base.external.dbsource" id="demo_postgre">
            <field name="name">PostgreSQL local</field>
            <field name="conn_string">dbname='postgres' password=%s</field>
            <field name="password">postgresql</field>
            <field name="connector">postgresql</field>
        </record>

This depends on the Postgres configuration. I have no better idea, but a comment should be added to the test saying to validate this first.

Thanks for the patch. I will look into whether I can integrate it in connector_odbc.

Regards

Nicolas

review: Needs Fixing (code review, no tests)
Revision history for this message
Daniel Reis (dreis-pt) wrote :

Maxime: I can work on the cleanup of this code, if you like.

Revision history for this message
Maxime Chambreuil (http://www.savoirfairelinux.com) (max3903) wrote :

Thanks Daniel. Please go ahead.

52. By Maxime Chambreuil (http://www.savoirfairelinux.com)

[FIX] PEP8 compliance and review comments

Revision history for this message
Maxime Chambreuil (http://www.savoirfairelinux.com) (max3903) wrote :

Daniel ?

Revision history for this message
Yannick Vaucher @ Camptocamp (yvaucher-c2c) wrote :

Please remove the executable bit on those files (chmod -x):

base_external_dbsource/base_external_dbsource_demo.xml
base_external_dbsource/__init__.py
base_external_dbsource/security/ir.model.access.csv
base_external_dbsource/test/dbsource_connect.yml
import_odbc/import_odbc_demo.xml
import_odbc/__init__.py
import_odbc/security/ir.model.access.csv

base_external_dbsource.py|33 col 1| F401 'pymssql' imported but unused
base_external_dbsource.py|40 col 1| F811 redefinition of unused 'sqlalchemy' from line 32
base_external_dbsource.py|41 col 1| F401 'MySQLdb' imported but unused
base_external_dbsource.py|65 col 1| F811 redefinition of unused 'sqlalchemy' from line 40

base_external_dbsource.py|166 col 1| F821 undefined name 'osv'
base_external_dbsource.py|177 col 1| F821 undefined name 'osv'

Use orm.except_orm instead
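The replacement keeps the same two-argument (title, message) call shape. A stand-in sketch (openerp is not importable outside a running server, so except_orm is mocked here with the same constructor):

```python
class except_orm(Exception):
    """Stand-in for openerp.osv.orm.except_orm, same constructor shape."""
    def __init__(self, name, value):
        super(except_orm, self).__init__(name, value)
        self.name = name
        self.value = value

def fail_connection(details):
    # replaces the undefined osv.except_osv(...) calls flagged by F821
    raise except_orm("Connection test failed!",
                     "Here is what we got instead:\n %s" % details)
```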

And at least split line 43 in import_odbc/import_odbc.py

Cheers

review: Needs Fixing (code review, no tests)
53. By Maxime Chambreuil (http://www.savoirfairelinux.com)

[FIX] PEP8 and file permission issues

Revision history for this message
Maxime Chambreuil (http://www.savoirfairelinux.com) (max3903) wrote :

Thanks Yannick.

I made the changes according to your review.

54. By Maxime Chambreuil (http://www.savoirfairelinux.com)

[FIX] Menu, views and version number

Unmerged revisions

Preview Diff

=== added directory 'base_external_dbsource'
=== added file 'base_external_dbsource/__init__.py'
--- base_external_dbsource/__init__.py 1970-01-01 00:00:00 +0000
+++ base_external_dbsource/__init__.py 2013-12-20 18:51:45 +0000
@@ -0,0 +1,24 @@
1# -*- coding: utf-8 -*-
2##############################################################################
3#
4# Daniel Reis
5# 2011
6#
7# This program is free software: you can redistribute it and/or modify
8# it under the terms of the GNU Affero General Public License as
9# published by the Free Software Foundation, either version 3 of the
10# License, or (at your option) any later version.
11#
12# This program is distributed in the hope that it will be useful,
13# but WITHOUT ANY WARRANTY; without even the implied warranty of
14# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15# GNU Affero General Public License for more details.
16#
17# You should have received a copy of the GNU Affero General Public License
18# along with this program. If not, see <http://www.gnu.org/licenses/>.
19#
20##############################################################################
21
22from . import base_external_dbsource
23
24# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
025
=== added file 'base_external_dbsource/__openerp__.py'
--- base_external_dbsource/__openerp__.py 1970-01-01 00:00:00 +0000
+++ base_external_dbsource/__openerp__.py 2013-12-20 18:51:45 +0000
@@ -0,0 +1,61 @@
1# -*- coding: utf-8 -*-
2##############################################################################
3#
4# Daniel Reis, 2011
5# Additional contributions by Maxime Chambreuil, Savoir-faire Linux
6#
7# This program is free software: you can redistribute it and/or modify
8# it under the terms of the GNU Affero General Public License as
9# published by the Free Software Foundation, either version 3 of the
10# License, or (at your option) any later version.
11#
12# This program is distributed in the hope that it will be useful,
13# but WITHOUT ANY WARRANTY; without even the implied warranty of
14# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15# GNU Affero General Public License for more details.
16#
17# You should have received a copy of the GNU Affero General Public License
18# along with this program. If not, see <http://www.gnu.org/licenses/>.
19#
20##############################################################################
21
22{
23 'name': 'External Database Sources',
24 'version': '61.3',
25 'category': 'Tools',
26 'description': """
27This module allows you to define connections to foreign databases using ODBC,
28Oracle Client or SQLAlchemy.
29
30Database sources can be configured in Settings -> Configuration -> Data sources.
31
32Depending on the database, you need:
33 * to install unixodbc and python-pyodbc packages to use ODBC connections.
34 * to install FreeTDS driver (tdsodbc package) and configure it through ODBC to
35 connect to Microsoft SQL Server.
36 * to install and configure Oracle Instant Client and cx_Oracle python library
37 to connect to Oracle.
38 """,
39 'author': 'Daniel Reis',
40 'website': 'http://launchpad.net/addons-tko',
41 'images': [
42 'images/screenshot01.png',
43 ],
44 'depends': [
45 'base',
46 ],
47 'data': [
48 'base_external_dbsource_view.xml',
49 'security/ir.model.access.csv',
50 ],
51 'demo': [
52 'base_external_dbsource_demo.xml',
53 ],
54 'test': [
55 'dbsource_connect.yml',
56 ],
57 'installable': True,
58 'active': False,
59}
60
61# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
062
=== added file 'base_external_dbsource/base_external_dbsource.py'
--- base_external_dbsource/base_external_dbsource.py 1970-01-01 00:00:00 +0000
+++ base_external_dbsource/base_external_dbsource.py 2013-12-20 18:51:45 +0000
@@ -0,0 +1,175 @@
1# -*- coding: utf-8 -*-
2##############################################################################
3#
4# Daniel Reis
5# 2011
6#
7# This program is free software: you can redistribute it and/or modify
8# it under the terms of the GNU Affero General Public License as
9# published by the Free Software Foundation, either version 3 of the
10# License, or (at your option) any later version.
11#
12# This program is distributed in the hope that it will be useful,
13# but WITHOUT ANY WARRANTY; without even the implied warranty of
14# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15# GNU Affero General Public License for more details.
16#
17# You should have received a copy of the GNU Affero General Public License
18# along with this program. If not, see <http://www.gnu.org/licenses/>.
19#
20##############################################################################
21
22import os
23import logging
24from openerp.osv import orm, fields
25from openerp.tools.translate import _
26import openerp.tools as tools
27_logger = logging.getLogger(__name__)
28
29CONNECTORS = []
30
31try:
32 import sqlalchemy
33 CONNECTORS.append(('sqlite', 'SQLite'))
34 try:
35 import pymssql
36 CONNECTORS.append(('mssql', 'Microsoft SQL Server'))
37 except:
38 _logger.info('MS SQL Server not available. Please install "pymssql"\
39 python package.')
40 try:
41 import MySQLdb
42 CONNECTORS.append(('mysql', 'MySQL'))
43 except:
44 _logger.info('MySQL not available. Please install "mysqldb"\
45 python package.')
46except:
47 _logger.info('SQL Alchemy not available. Please install "sqlalchemy"\
48 python package.')
49try:
50 import pyodbc
51 CONNECTORS.append(('pyodbc', 'ODBC'))
52except:
53 _logger.info('ODBC libraries not available. Please install "unixodbc"\
54 and "python-pyodbc" packages.')
55
56try:
57 import cx_Oracle
58 CONNECTORS.append(('cx_Oracle', 'Oracle'))
59except:
60 _logger.info('Oracle libraries not available. Please install "cx_Oracle"\
61 python package.')
62
63import psycopg2
64CONNECTORS.append(('postgresql', 'PostgreSQL'))
65
66
67class base_external_dbsource(orm.Model):
68 _name = "base.external.dbsource"
69 _description = 'External Database Sources'
70 _columns = {
71 'name': fields.char('Datasource name', required=True, size=64),
72 'conn_string': fields.text('Connection string', help="""
73Sample connection strings:
74- Microsoft SQL Server:
75 mssql+pymssql://username:%s@server:port/dbname?charset=utf8
76- MySQL: mysql://user:%s@server:port/dbname
77- ODBC: DRIVER={FreeTDS};SERVER=server.address;Database=mydb;UID=sa
78- ORACLE: username/%s@//server.address:port/instance
79- PostgreSQL:
80 dbname='template1' user='dbuser' host='localhost' port='5432' password=%s
81- SQLite: sqlite:///test.db
82"""),
83 'password': fields.char('Password', size=40),
84 'connector': fields.selection(CONNECTORS, 'Connector',
85 required=True,
86 help="If a connector is missing from the\
87 list, check the server log to confirm\
88 that the required components were\
89 detected."),
90 }
91
92 def conn_open(self, cr, uid, id1):
93 #Get dbsource record
94 data = self.browse(cr, uid, id1)
95 #Build the full connection string
96 connStr = data.conn_string
97 if data.password:
98 if '%s' not in data.conn_string:
99 connStr += ';PWD=%s'
100 connStr = connStr % data.password
101 #Try to connect
102 if data.connector == 'cx_Oracle':
103 os.environ['NLS_LANG'] = 'AMERICAN_AMERICA.UTF8'
104 conn = cx_Oracle.connect(connStr)
105 elif data.connector == 'pyodbc':
106 conn = pyodbc.connect(connStr)
107 elif data.connector in ('sqlite', 'mysql', 'mssql'):
108 conn = sqlalchemy.create_engine(connStr).connect()
109 elif data.connector == 'postgresql':
110 conn = psycopg2.connect(connStr)
111
112 return conn
113
114 def execute(self, cr, uid, ids, sqlquery, sqlparams=None, metadata=False,
115 context=None):
116 """Executes SQL and returns a list of rows.
117
118 "sqlparams" can be a dict of values, that can be referenced in
119 the SQL statement using "%(key)s" or, in the case of Oracle,
120 ":key".
121 Example:
122 sqlquery = "select * from mytable where city = %(city)s and
123 date > %(dt)s"
124 params = {'city': 'Lisbon',
125 'dt': datetime.datetime(2000, 12, 31)}
126
127 If metadata=True, it will instead return a dict containing the
128 rows list and the columns list, in the format:
129 { 'cols': [ 'col_a', 'col_b', ...]
130 , 'rows': [ (a0, b0, ...), (a1, b1, ...), ...] }
131 """
132 data = self.browse(cr, uid, ids)
133 rows, cols = list(), list()
134 for obj in data:
135 conn = self.conn_open(cr, uid, obj.id)
136 if obj.connector in ["sqlite", "mysql", "mssql"]:
137 #using sqlalchemy
138 cur = conn.execute(sqlquery, sqlparams)
139 if metadata:
140 cols = cur.keys()
141 rows = [r for r in cur]
142 else:
143 #using other db connectors
144 cur = conn.cursor()
145 cur.execute(sqlquery, sqlparams)
146 if metadata:
147 cols = [d[0] for d in cur.description]
148 rows = cur.fetchall()
149 conn.close()
150 if metadata:
151 return{'cols': cols, 'rows': rows}
152 else:
153 return rows
154
155 def connection_test(self, cr, uid, ids, context=None):
156 for obj in self.browse(cr, uid, ids, context):
157 conn = False
158 try:
159 conn = self.conn_open(cr, uid, obj.id)
160 except Exception, e:
161 raise orm.except_orm(_("Connection test failed!"),
162 _("Here is what we got instead:\n %s")
163 % tools.ustr(e))
164 finally:
165 try:
166 if conn:
167 conn.close()
168 except Exception:
169 # ignored, just a consequence of the previous exception
170 pass
171 #TODO: if OK a (wizard) message box should be displayed
172 raise orm.except_orm(_("Connection test succeeded!"),
173 _("Everything seems properly set up!"))
174
175#EOF
0176
=== added file 'base_external_dbsource/base_external_dbsource_demo.xml'
--- base_external_dbsource/base_external_dbsource_demo.xml 1970-01-01 00:00:00 +0000
+++ base_external_dbsource/base_external_dbsource_demo.xml 2013-12-20 18:51:45 +0000
@@ -0,0 +1,15 @@
1<?xml version="1.0"?>
2<openerp>
3 <data>
4
5 <record model="base.external.dbsource" id="demo_postgre">
6 <field name="name">PostgreSQL local</field>
7 <field name="conn_string">dbname='postgres' password=%s</field>
8 <field name="password">postgresql</field>
9 <field name="connector">postgresql</field>
10 </record>
11
12 </data>
13</openerp>
14
15
016
=== added file 'base_external_dbsource/base_external_dbsource_view.xml'
--- base_external_dbsource/base_external_dbsource_view.xml 1970-01-01 00:00:00 +0000
+++ base_external_dbsource/base_external_dbsource_view.xml 2013-12-20 18:51:45 +0000
@@ -0,0 +1,54 @@
1<?xml version="1.0"?>
2<openerp>
3 <data>
4
5 <!-- DBSource -->
6
7 <record model="ir.ui.view" id="view_dbsource_tree">
8 <field name="name">base.external.dbsource.tree</field>
9 <field name="model">base.external.dbsource</field>
10 <field name="type">tree</field>
11 <field name="arch" type="xml">
12 <tree string="External DB Sources">
13 <field name="name"/>
14 <field name="connector"/>
15 <field name="conn_string"/>
16 </tree>
17 </field>
18 </record>
19
20 <record model="ir.ui.view" id="view_dbsource_form">
21 <field name="name">base.external.dbsource.form</field>
22 <field name="model">base.external.dbsource</field>
23 <field name="type">form</field>
24 <field name="arch" type="xml">
25 <form string="External DB Source">
26 <field name="name"/>
27 <field name="password" password="True"/>
28 <newline/>
29 <field name="connector" colspan="2"/>
30 <newline/>
31 <field name="conn_string" colspan="4"/>
32 <newline/>
33 <button name="connection_test" string="Test Connection" type="object" icon="gtk-network" colspan="4"/>
34 </form>
35 </field>
36 </record>
37
38 <record model="ir.actions.act_window" id="action_dbsource">
39 <field name="name">External Database Sources</field>
40 <field name="res_model">base.external.dbsource</field>
41 <field name="view_type">form</field>
42 <field name="view_mode">tree,form</field>
43 <field name="view_id" ref="view_dbsource_tree"/>
44 </record>
45
46 <menuitem name="Database Sources"
47 id="menu_dbsource"
48 parent="base.next_id_9"
49 action="action_dbsource"/>
50
51 </data>
52</openerp>
53
54
055
=== added directory 'base_external_dbsource/images'
=== added file 'base_external_dbsource/images/screenshot01.png'
1Binary files base_external_dbsource/images/screenshot01.png 1970-01-01 00:00:00 +0000 and base_external_dbsource/images/screenshot01.png 2013-12-20 18:51:45 +0000 differ
=== added directory 'base_external_dbsource/security'
=== added file 'base_external_dbsource/security/ir.model.access.csv'
--- base_external_dbsource/security/ir.model.access.csv 1970-01-01 00:00:00 +0000
+++ base_external_dbsource/security/ir.model.access.csv 2013-12-20 18:51:45 +0000
@@ -0,0 +1,2 @@
1id,name,model_id:id,group_id:id,perm_read,perm_write,perm_create,perm_unlink
2access_base_external_dbsource_group_system,bae_external_dbsource_group_system,model_base_external_dbsource,base.group_system,1,1,1,1
03
=== added directory 'base_external_dbsource/test'
=== added file 'base_external_dbsource/test/dbsource_connect.yml'
--- base_external_dbsource/test/dbsource_connect.yml 1970-01-01 00:00:00 +0000
+++ base_external_dbsource/test/dbsource_connect.yml 2013-12-20 18:51:45 +0000
@@ -0,0 +1,5 @@
1-
2 Connect to local Postgres.
3-
4 !python {model: base.external.dbsource}: |
5 self.connection_test(cr, uid, [ref("demo_postgre")])
06
=== added directory 'import_odbc'
=== added file 'import_odbc/__init__.py'
--- import_odbc/__init__.py 1970-01-01 00:00:00 +0000
+++ import_odbc/__init__.py 2013-12-20 18:51:45 +0000
@@ -0,0 +1,24 @@
1# -*- coding: utf-8 -*-
2##############################################################################
3#
4# Daniel Reis
5# 2011
6#
7# This program is free software: you can redistribute it and/or modify
8# it under the terms of the GNU Affero General Public License as
9# published by the Free Software Foundation, either version 3 of the
10# License, or (at your option) any later version.
11#
12# This program is distributed in the hope that it will be useful,
13# but WITHOUT ANY WARRANTY; without even the implied warranty of
14# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15# GNU Affero General Public License for more details.
16#
17# You should have received a copy of the GNU Affero General Public License
18# along with this program. If not, see <http://www.gnu.org/licenses/>.
19#
20##############################################################################
21
22from . import import_odbc
23
24# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
025
=== added file 'import_odbc/__openerp__.py'
--- import_odbc/__openerp__.py 1970-01-01 00:00:00 +0000
+++ import_odbc/__openerp__.py 2013-12-20 18:51:45 +0000
@@ -0,0 +1,83 @@
1# -*- coding: utf-8 -*-
2##############################################################################
3#
4# Daniel Reis
5# 2011
6#
7# This program is free software: you can redistribute it and/or modify
8# it under the terms of the GNU Affero General Public License as
9# published by the Free Software Foundation, either version 3 of the
10# License, or (at your option) any later version.
11#
12# This program is distributed in the hope that it will be useful,
13# but WITHOUT ANY WARRANTY; without even the implied warranty of
14# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15# GNU Affero General Public License for more details.
16#
17# You should have received a copy of the GNU Affero General Public License
18# along with this program. If not, see <http://www.gnu.org/licenses/>.
19#
20##############################################################################
21
22{
23 'name': 'Import data from SQL and ODBC data sources.',
24 'version': '61.3',
25 'category': 'Tools',
26 'description': """
27Import data directly from other databases.
28
29Installed in the Administration module, menu Configuration -> Import from SQL.
30
31Features:
32 * Data fetched from the databases is used to build lines equivalent to regular import files. These are imported using the standard "import_data()" ORM method, benefiting from all its features, including xml_ids.
33 * Each table import is defined by an SQL statement, used to build the equivalent of an import file. Each column's name should match the column names you would use in an import file. The first column must provide a unique identifier for the record, and will be used to build its xml_id.
34 * SQL columns named "none" are ignored. This can be used for the first column of the SQL, so that it is used to build the XML Id but is not imported into any OpenERP field.
35 * The last sync date (the datetime of the last successful execution) can be used in the SQL using "%(sync)s", or ":sync" in the case of Oracle.
36 * When errors are found, only the record with the error fails to import. The other correct records are committed. However, the "last sync date" will only be automatically updated when no errors are found.
37 * The import execution can be scheduled to run automatically.
38
39Examples:
40 * Importing suppliers to res.partner:
41 SELECT distinct
42 [SUPPLIER_CODE] as "ref"
43 , [SUPPLIER_NAME] as "name"
44 , 1 as "is_supplier"
45 , [INFO] as "comment"
46 FROM T_SUPPLIERS
47 WHERE INACTIVE_DATE IS NULL and DATE_CHANGED >= %(sync)s
48
49 * Importing products to product.product:
50 SELECT PRODUCT_CODE as "ref"
51 , PRODUCT_NAME as "name"
52 , 'res_partner_id_'+SUPPLIER_ID as "partner_id/id"
53 FROM T_PRODUCTS
54 WHERE DATE_CHANGED >= %(sync)s
55
56Improvements ideas waiting for a contributor:
57 * Allow to import many2one fields (currently not supported). Done by adding a second SQL sentence to get child record list?
58 * Allow "import sets" that can be executed at different time intervals using different scheduler jobs.
59 * Allow to inactivate/delete OpenERP records when not present in an SQL result set.
60 """,
61 'author': 'Daniel Reis',
62 'website': 'http://launchpad.net/addons-tko',
63 'images': [
64 'images/snapshot1.png',
65 'images/snapshot2.png',
66 ],
67 'depends': [
68 'base',
69 'base_external_dbsource',
70 ],
71 'data': [
72 'import_odbc_view.xml',
73 'security/ir.model.access.csv',
74 ],
75 'demo': [
76 'import_odbc_demo.xml',
77 ],
78 'test': [],
79 'installable': True,
80 'active': False,
81}
82
83# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
084
=== added directory 'import_odbc/images'
=== added file 'import_odbc/images/snapshot1.png'
1Binary files import_odbc/images/snapshot1.png 1970-01-01 00:00:00 +0000 and import_odbc/images/snapshot1.png 2013-12-20 18:51:45 +0000 differ
=== added file 'import_odbc/images/snapshot2.png'
2Binary files import_odbc/images/snapshot2.png 1970-01-01 00:00:00 +0000 and import_odbc/images/snapshot2.png 2013-12-20 18:51:45 +0000 differ
=== added file 'import_odbc/import_odbc.py'
--- import_odbc/import_odbc.py 1970-01-01 00:00:00 +0000
+++ import_odbc/import_odbc.py 2013-12-20 18:51:45 +0000
@@ -0,0 +1,218 @@
1# -*- coding: utf-8 -*-
2##############################################################################
3#
4# Daniel Reis
5# 2011
6#
7# This program is free software: you can redistribute it and/or modify
8# it under the terms of the GNU Affero General Public License as
9# published by the Free Software Foundation, either version 3 of the
10# License, or (at your option) any later version.
11#
12# This program is distributed in the hope that it will be useful,
13# but WITHOUT ANY WARRANTY; without even the implied warranty of
14# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15# GNU Affero General Public License for more details.
16#
17# You should have received a copy of the GNU Affero General Public License
18# along with this program. If not, see <http://www.gnu.org/licenses/>.
19#
20##############################################################################
21
22import sys
23from datetime import datetime
24from openerp.osv import orm, fields
25import logging
26_logger = logging.getLogger(__name__)
27_loglvl = _logger.getEffectiveLevel()
28SEP = '|'
29
30
31class import_odbc_dbtable(orm.Model):
32 _name = "import.odbc.dbtable"
33 _description = 'Import Table Data'
34 _order = 'exec_order'
35 _columns = {
36 'name': fields.char('Datasource name', required=True, size=64),
37 'enabled': fields.boolean('Execution enabled'),
38 'dbsource_id': fields.many2one('base.external.dbsource', 'Database source', required=True),
39 'sql_source': fields.text('SQL', required=True, help='Column names must be valid "import_data" columns.'),
40 'model_target': fields.many2one('ir.model', 'Target object'),
41 'noupdate': fields.boolean('No updates', help="Only create new records; disable updates to existing records."),
42 'exec_order': fields.integer('Execution order', help="Defines the order to perform the import"),
43 'last_sync': fields.datetime('Last sync date',
44 help="Datetime for the last successful sync."
45 "\nLater changes on the source may not be replicated on the destination"),
46 'start_run': fields.datetime('Time started', readonly=True),
47 'last_run': fields.datetime('Time ended', readonly=True),
48 'last_record_count': fields.integer('Last record count', readonly=True),
49 'last_error_count': fields.integer('Last error count', readonly=True),
50 'last_warn_count': fields.integer('Last warning count', readonly=True),
51 'last_log': fields.text('Last run log', readonly=True),
52 'ignore_rel_errors': fields.boolean('Ignore relationship errors',
53 help="On error try to reimport rows ignoring relationships."),
54 'raise_import_errors': fields.boolean('Raise import errors',
55 help="Import errors not handled, intended for debugging purposes."
56 "\nAlso forces debug messages to be written to the server log."),
57 }
58 _defaults = {
59 'enabled': True,
60 'exec_order': 10,
61 }
62
63 def _import_data(self, cr, uid, flds, data, model_obj, table_obj, log):
64 """Import data and returns error msg or empty string"""
65
66 def find_m2o(field_list):
67 """Find index of first column with a many2one field"""
68 for i, x in enumerate(field_list):
69 if len(x) > 3 and x[-3:] == ':id' or x[-3:] == '/id':
70 return i
71 return -1
72
73 def append_to_log(log, level, obj_id='', msg='', rel_id=''):
74 if '_id_' in obj_id:
75 obj_id = '.'.join(obj_id.split('_')[:-2]) + ': ' + obj_id.split('_')[-1]
76 if ': .' in msg and not rel_id:
77 rel_id = msg[msg.find(': .')+3:]
78 if '_id_' in rel_id:
79 rel_id = '.'.join(rel_id.split('_')[:-2]) + ': ' + rel_id.split('_')[-1]
80 msg = msg[:msg.find(': .')]
81 log['last_log'].append('%s|%s\t|%s\t|%s' % (level.ljust(5), obj_id, rel_id, msg))
82 _logger.debug(data)
83 cols = list(flds) # copy to avoid side effects
84 errmsg = str()
85 if table_obj.raise_import_errors:
86 model_obj.import_data(cr, uid, cols, [data], noupdate=table_obj.noupdate)
87 else:
88 try:
89 model_obj.import_data(cr, uid, cols, [data], noupdate=table_obj.noupdate)
90 except:
91 errmsg = str(sys.exc_info()[1])
92 if errmsg and not table_obj.ignore_rel_errors:
93 #Fail
94 append_to_log(log, 'ERROR', data, errmsg)
95 log['last_error_count'] += 1
96 return False
97 if errmsg and table_obj.ignore_rel_errors:
98 #Warn and retry ignoring many2one fields...
99 append_to_log(log, 'WARN', data, errmsg)
100 log['last_warn_count'] += 1
101 #Try ignoring each many2one (tip: in the SQL sentence select more problematic FKs first)
102 i = find_m2o(cols)
103 if i >= 0:
104 #Try again without the [i] column
105 del cols[i]
106 del data[i]
107 self._import_data(cr, uid, cols, data, model_obj, table_obj, log)
108 else:
109 #Fail
110 append_to_log(log, 'ERROR', data, 'Removed all m2o keys and still fails.')
111 log['last_error_count'] += 1
112 return False
113 return True
114
115 def import_run(self, cr, uid, ids=None, context=None):
116 db_model = self.pool.get('base.external.dbsource')
117 actions = self.read(cr, uid, ids, ['id', 'exec_order'])
118 actions.sort(key=lambda x: (x['exec_order'], x['id']))
119
120 #Consider each dbtable:
121 for action_ref in actions:
122 obj = self.browse(cr, uid, action_ref['id'])
123 if not obj.enabled:
124 continue # skip
125
126 _logger.setLevel(obj.raise_import_errors and logging.DEBUG or _loglvl)
127 _logger.debug('Importing %s...' % obj.name)
128
129 #now() microseconds are stripped to avoid problem with SQL smalldate
130 #TODO: convert UTC Now to local timezone
131 #http://stackoverflow.com/questions/4770297/python-convert-utc-datetime-string-to-local-datetime
132 model_name = obj.model_target.model
133 model_obj = self.pool.get(model_name)
134 xml_prefix = model_name.replace('.', '_') + "_id_"
135 log = {'start_run': datetime.now().replace(microsecond=0),
136 'last_run': None,
137 'last_record_count': 0,
138 'last_error_count': 0,
139 'last_warn_count': 0,
140 'last_log': list()}
141 self.write(cr, uid, [obj.id], log)
142
143 #Prepare SQL sentence; replace "%s" with the last_sync date
144 if obj.last_sync:
145 sync = datetime.strptime(obj.last_sync, "%Y-%m-%d %H:%M:%S")
146 else:
147 sync = datetime.datetime(1900, 1, 1, 0, 0, 0)
148 params = {'sync': sync}
149 res = db_model.execute(cr, uid, [obj.dbsource_id.id],
150 obj.sql_source, params, metadata=True)
151
152 #Exclude columns titled "None"; add (xml_)"id" column
153 cidx = [i for i, x in enumerate(res['cols']) if x.upper() != 'NONE']
154 cols = [x for i, x in enumerate(res['cols']) if x.upper() != 'NONE'] + ['id']
155
156 #Import each row:
157 for row in res['rows']:
158 #Build data row; import only columns present in the "cols" list
159 data = list()
160 for i in cidx:
161 #TODO: Handle imported datetimes properly - convert from localtime to UTC!
162 v = row[i]
163 if isinstance(v, str):
164 v = v.strip()
165 data.append(v)
166 data.append(xml_prefix + str(row[0]).strip())
167
168 #Import the row; on error, write line to the log
169 log['last_record_count'] += 1
170 self._import_data(cr, uid, cols, data, model_obj, obj, log)
171 if log['last_record_count'] % 500 == 0:
172 _logger.info('...%s rows processed...' % (log['last_record_count']))
173
174 #Finished importing all rows
175 #If no errors, write new sync date
176 if not (log['last_error_count'] or log['last_warn_count']):
177 log['last_sync'] = log['start_run']
178 level = logging.DEBUG
179 if log['last_warn_count']:
180 level = logging.WARN
181 if log['last_error_count']:
182 level = logging.ERROR
183 _logger.log(level, 'Imported %s , %d rows, %d errors, %d warnings.' % (
184 model_name, log['last_record_count'], log['last_error_count'],
185 log['last_warn_count']))
186 #Write run log, either if the table import is active or inactive
187 if log['last_log']:
188 log['last_log'].insert(0, 'LEVEL|== Line == |== Relationship ==|== Message ==')
189 log.update({'last_log': '\n'.join(log['last_log'])})
190 log.update({'last_run': datetime.now().replace(microsecond=0)})
191 self.write(cr, uid, [obj.id], log)
192
193 #Finished
194 _logger.debug('Import job FINISHED.')
195 return True
196
197 def import_schedule(self, cr, uid, ids, context=None):
198 cron_obj = self.pool.get('ir.cron')
199 new_create_id = cron_obj.create(cr, uid, {
200 'name': 'Import ODBC tables',
201 'interval_type': 'hours',
202 'interval_number': 1,
203 'numbercall': -1,
204 'model': 'import.odbc.dbtable',
205 'function': 'import_run',
206 'doall': False,
207 'active': True
208 })
209 return {
210 'name': 'Import ODBC tables',
211 'view_type': 'form',
212 'view_mode': 'form,tree',
213 'res_model': 'ir.cron',
214 'res_id': new_create_id,
215 'type': 'ir.actions.act_window',
216 }
217
218#EOF
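As a side note for reviewers, the column-filtering and row-building loop in `import_run` above can be exercised in isolation. The sketch below uses invented sample data (the `res` dict and `xml_prefix` value are illustrative, not taken from the module); it shows how columns titled "None" are dropped, string values are stripped, and an XML id derived from the first column is appended:

```python
# Standalone sketch of import_run's row-building logic; 'res' mimics the
# dict that the dbsource's execute(..., metadata=True) is assumed to return.
res = {
    'cols': ['login', 'None', 'name'],            # column titled "None" is skipped
    'rows': [('jdoe', 'ignored', ' John Doe ')],  # invented sample row
}
xml_prefix = 'demo_table_'  # hypothetical prefix for generated xml ids

# Exclude columns titled "None"; add the (xml_)"id" column
cidx = [i for i, x in enumerate(res['cols']) if x.upper() != 'NONE']
cols = [x for i, x in enumerate(res['cols']) if x.upper() != 'NONE'] + ['id']

data_rows = []
for row in res['rows']:
    data = []
    for i in cidx:
        v = row[i]
        if isinstance(v, str):
            v = v.strip()  # trim imported strings
        data.append(v)
    # Append a stable external id built from the first column
    data.append(xml_prefix + str(row[0]).strip())
    data_rows.append(data)

print(cols)       # ['login', 'name', 'id']
print(data_rows)  # [['jdoe', 'John Doe', 'demo_table_jdoe']]
```

Each resulting `data` row is then handed to `_import_data` together with `cols`, so the SQL column aliases double as the target model's field names.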
=== added file 'import_odbc/import_odbc_demo.xml'
--- import_odbc/import_odbc_demo.xml 1970-01-01 00:00:00 +0000
+++ import_odbc/import_odbc_demo.xml 2013-12-20 18:51:45 +0000
@@ -0,0 +1,15 @@
1<?xml version="1.0"?>
2<openerp>
3 <data>
4
5 <record model="import.odbc.dbtable" id="demo_postgresql_users">
6        <field name="name">Users from PostgreSQL</field>
7 <field name="dbsource_id" ref="base_external_dbsource.demo_postgre"/>
8 <field name="sql_source">select usename as "login", usename as "name" from pg_catalog.pg_user</field>
9 <field name="model_target" ref="base.model_res_users"/>
10 </record>
11
12 </data>
13</openerp>
14
15
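For context on the demo record above: its SQL is run through the dbsource's `execute()` method, which with `metadata=True` returns a dict of column names and rows. A rough illustration of that shape, using sqlite3 as a stand-in for the external source (the table and data here are invented; the real demo query reads `pg_catalog.pg_user`):

```python
# Sketch of the {'cols': ..., 'rows': ...} result shape that
# base.external.dbsource's execute(..., metadata=True) is expected to
# return; sqlite3 stands in for the external PostgreSQL database.
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE pg_user (usename TEXT)')
conn.executemany('INSERT INTO pg_user VALUES (?)', [('admin',), ('demo',)])
cur = conn.execute(
    'SELECT usename AS "login", usename AS "name" '
    'FROM pg_user ORDER BY usename')
res = {'cols': [d[0] for d in cur.description], 'rows': cur.fetchall()}
print(res['cols'])  # ['login', 'name']
print(res['rows'])  # [('admin', 'admin'), ('demo', 'demo')]
```

Aliasing `usename` to `login` and `name` is what maps the external column onto the corresponding `res.users` fields during import.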
=== added file 'import_odbc/import_odbc_view.xml'
--- import_odbc/import_odbc_view.xml 1970-01-01 00:00:00 +0000
+++ import_odbc/import_odbc_view.xml 2013-12-20 18:51:45 +0000
@@ -0,0 +1,86 @@
1<?xml version="1.0"?>
2<openerp>
3 <data>
4
5 <!-- Table form -->
6
7 <record model="ir.ui.view" id="view_import_dbtable_form">
8 <field name="name">import.odbc.dbtable.form</field>
9 <field name="model">import.odbc.dbtable</field>
10 <field name="type">form</field>
11 <field name="arch" type="xml">
12 <form>
13 <field name="name" search="1"/>
14 <field name="exec_order"/>
15 <field name="model_target"/>
16 <field name="dbsource_id" search="1"/>
17 <field name="noupdate"/>
18 <field name="enabled"/>
19 <field name="ignore_rel_errors"/>
20 <field name="raise_import_errors"/>
21 <field name="last_sync"/>
22 <group colspan="2">
23 <button name="import_run" string="Run Import" type="object" icon="gtk-execute"/>
24 <button name="import_schedule" string="Schedule Import" type="object" icon="gtk-paste"/>
25 </group>
26 <field name="sql_source" colspan="4"/>
27 <separator string="Last execution" colspan="4"/>
28 <field name="last_record_count"/>
29 <field name="start_run"/>
30 <field name="last_warn_count"/>
31 <field name="last_run"/>
32 <field name="last_error_count"/>
33 <field name="last_log" colspan="4"/>
34 </form>
35 </field>
36 </record>
37
38 <!-- Table Tree -->
39
40 <record id="view_import_dbtable_tree" model="ir.ui.view">
41 <field name="name">import.odbc.dbtable.tree</field>
42 <field name="model">import.odbc.dbtable</field>
43 <field name="type">tree</field>
44 <field name="arch" type="xml">
45 <tree colors="grey: enabled==False; red:last_error_count&gt;0; blue:last_warn_count&gt;0">
46 <field name="exec_order"/>
47 <field name="name"/>
48 <field name="model_target"/>
49 <field name="dbsource_id"/>
50 <field name="enabled"/>
51 <field name="last_run"/>
52 <field name="last_sync"/>
53 <field name="last_record_count"/>
54 <field name="last_error_count"/>
55 <field name="last_warn_count"/>
56 </tree>
57 </field>
58        </record>
59
60
61        <!-- Tree Search -->
62 <record id="view_import_dbtable_filter" model="ir.ui.view">
63 <field name="name">import.odbc.dbtable.filter</field>
64 <field name="model">import.odbc.dbtable</field>
65 <field name="type">search</field>
66 <field name="arch" type="xml">
67 <search string="Search ODBC Imports">
68 <field name="name"/>
69 <field name="dbsource_id"/>
70 <field name="model_target"/>
71 </search>
72 </field>
73 </record>
74
75        <!-- Menu -->
76 <record model="ir.actions.act_window" id="action_import_dbtable">
77 <field name="name">Import from SQL</field>
78 <field name="res_model">import.odbc.dbtable</field>
79 <field name="view_type">form</field>
80 </record>
81 <menuitem name="Import from SQL" id="menu_import_dbtable" parent="base.next_id_15" action="action_import_dbtable"/>
82</data>
83</openerp>
84
85
86
=== added directory 'import_odbc/security'
=== added file 'import_odbc/security/ir.model.access.csv'
--- import_odbc/security/ir.model.access.csv 1970-01-01 00:00:00 +0000
+++ import_odbc/security/ir.model.access.csv 2013-12-20 18:51:45 +0000
@@ -0,0 +1,2 @@
1id,name,model_id:id,group_id:id,perm_read,perm_write,perm_create,perm_unlink
2access_import_odbc_dbsource_group_system,import_odbc_dbtable_group_system,model_import_odbc_dbtable,base.group_system,1,1,1,1