Merge lp:~zorba-coders/zorba/feature-ft_module into lp:zorba
Status: Merged
Approved by: Matthias Brantner
Approved revision: no longer in the source branch.
Merged at revision: 10852
Proposed branch: lp:~zorba-coders/zorba/feature-ft_module
Merge into: lp:zorba
Diff against target: 455 lines (+76/-76), 16 files modified

- doc/zorba/ft_tokenizer.dox (+1/-1)
- modules/com/zorba-xquery/www/modules/full-text.xq (+8/-8)
- src/functions/func_ft_module_impl.cpp (+8/-8)
- src/functions/func_ft_module_impl.h (+3/-3)
- src/functions/function_consts.h (+2/-2)
- src/runtime/full_text/ft_module_impl.cpp (+15/-14)
- src/runtime/full_text/pregenerated/ft_module.cpp (+15/-15)
- src/runtime/full_text/pregenerated/ft_module.h (+9/-9)
- src/runtime/spec/full_text/ft_module.xml (+1/-1)
- src/runtime/visitors/pregenerated/planiter_visitor.h (+3/-3)
- src/runtime/visitors/pregenerated/printer_visitor.cpp (+5/-5)
- src/runtime/visitors/pregenerated/printer_visitor.h (+2/-2)
- test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-1.xq (+1/-1)
- test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-2.xq (+1/-2)
- test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-3.xq (+1/-1)
- test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-4.xq (+1/-1)
To merge this branch: bzr merge lp:~zorba-coders/zorba/feature-ft_module
Related bugs:
Reviewer | Review Type | Date Requested | Status
---|---|---|---
Matthias Brantner | Approve | |
Paul J. Lucas | Approve | |

Review via email: mp+106235@code.launchpad.net
Commit message
Getting in another public API change for 2.5 for the full-text module since now's the time to do it. Renamed tokenize() to tokenize-node() for 2 reasons:
1. There already exists tokenize-string() and therefore tokenize-node() is a better name than just plain tokenize().
2. The forthcoming addition of the black & white tokenization function will most likely be called tokenize-nodes() -- plural.
Description of the change
Getting in another public API change for 2.5 for the full-text module since now's the time to do it. Renamed tokenize() to tokenize-node() for 2 reasons:
1. There already exists tokenize-string() and therefore tokenize-node() is a better name than just plain tokenize().
2. The forthcoming addition of the black & white tokenization function will most likely be called tokenize-nodes() -- plural.
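As a sketch of the user-visible effect of the rename (the module and schema namespaces and the query shape are borrowed from the test queries in this proposal; the final `count()` is illustrative, not an asserted result):

```xquery
import module namespace ft = "http://www.zorba-xquery.com/modules/full-text";
import schema namespace fts = "http://www.zorba-xquery.com/modules/full-text";

(: Before this change, the call was ft:tokenize( $doc, xs:language("en") ). :)
let $doc := <msg>hello, world</msg>
let $tokens := ft:tokenize-node( $doc, xs:language("en") )
return count( $tokens )
```

ft:tokenize-string() continues to take a plain xs:string, which is what motivates the -node suffix on the renamed function.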
Paul J. Lucas (paul-lucas):

Zorba Build Bot (zorba-buildbot) wrote:

Zorba Build Bot (zorba-buildbot) wrote:
The attempt to merge lp:~zorba-coders/zorba/feature-ft_module into lp:zorba failed. Below is the output from the failed tests.
CMake Error at /home/ceej/
Validation queue job feature-
finished. The final status was:
1 tests did not succeed - changes not committed.
Error in read script: /home/ceej/
Matthias Brantner (matthias-brantner):

Zorba Build Bot (zorba-buildbot) wrote:
Validation queue starting for merge proposal.
Log at: http://zorbatest.lambda.nu:8080/remotequeue/feature-ft_module-2012-05-17T21-57-06.963Z/log.html
Zorba Build Bot (zorba-buildbot) wrote:
Validation queue job feature-
All tests succeeded!
10852. By Paul J. Lucas
Getting in another public API change for 2.5 for the full-text module since now's the time to do it. Renamed tokenize() to tokenize-node() for 2 reasons:
1. There already exists tokenize-string() and therefore tokenize-node() is a better name than just plain tokenize().
2. The forthcoming addition of the black & white tokenization function will most likely be called tokenize-nodes() -- plural.
Approved: Matthias Brantner, Paul J. Lucas
Preview Diff
=== modified file 'doc/zorba/ft_tokenizer.dox'
--- doc/zorba/ft_tokenizer.dox	2012-05-16 01:01:06 +0000
+++ doc/zorba/ft_tokenizer.dox	2012-05-17 22:48:19 +0000
@@ -152,7 +152,7 @@
 </tr>
 </table>
 
-A complete implementation of \c %tokenize() is non-trivial
+A complete implementation of \c %tokenize_string() is non-trivial
 and therefore an example is beyond the scope of this API documentation.
 However,
 the things a tokenizer should take into consideration include:
 
=== modified file 'modules/com/zorba-xquery/www/modules/full-text.xq'
--- modules/com/zorba-xquery/www/modules/full-text.xq	2012-05-08 23:49:22 +0000
+++ modules/com/zorba-xquery/www/modules/full-text.xq	2012-05-17 22:48:19 +0000
@@ -762,7 +762,7 @@
 as xs:string+ external;
 
 (:~
- : Tokenizes the given document.
+ : Tokenizes the given node and all of its descendants.
 :
 : @param $node The node to tokenize.
 : @param $lang The default
@@ -770,13 +770,13 @@
 : of <code>$node</code>.
 : @return a (possibly empty) sequence of tokens.
 : @error err:FTST0009 if <code>$lang</code> is not supported in general.
- : @example test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-1.xq
+ : @example test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-1.xq
 :)
-declare function ft:tokenize( $node as node(), $lang as xs:language )
+declare function ft:tokenize-node( $node as node(), $lang as xs:language )
 as element(ft-schema:token)* external;
 
 (:~
- : Tokenizes the given document.
+ : Tokenizes the given node and all of its descendants.
 :
 : @param $node The node to tokenize.
 : The document's default
@@ -785,11 +785,11 @@
 : @return a (possibly empty) sequence of tokens.
 : @error err:FTST0009 if <code>ft:current-lang()</code> is not supported in
 : general.
- : @example test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-2.xq
- : @example test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-3.xq
- : @example test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-4.xq
+ : @example test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-2.xq
+ : @example test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-3.xq
+ : @example test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-4.xq
 :)
-declare function ft:tokenize( $node as node() )
+declare function ft:tokenize-node( $node as node() )
 as element(ft-schema:token)* external;
 
 (:~
 
=== modified file 'src/functions/func_ft_module_impl.cpp'
--- src/functions/func_ft_module_impl.cpp	2012-05-15 21:13:21 +0000
+++ src/functions/func_ft_module_impl.cpp	2012-05-17 22:48:19 +0000
@@ -25,14 +25,14 @@
 
 #ifndef ZORBA_NO_FULL_TEXT
 
-PlanIter_t full_text_tokenize::codegen(
+PlanIter_t full_text_tokenize_node::codegen(
   CompilerCB*,
   static_context* sctx,
   const QueryLoc& loc,
   std::vector<PlanIter_t>& argv,
   expr& ann) const
 {
-  return new TokenizeIterator(sctx, loc, argv);
+  return new TokenizeNodeIterator(sctx, loc, argv);
 }
 
 
@@ -90,20 +90,20 @@
     false);
   {
     DECL_WITH_KIND(sctx,
-      full_text_tokenize,
-      (createQName(FT_MODULE_NS, "", "tokenize"),
+      full_text_tokenize_node,
+      (createQName(FT_MODULE_NS, "", "tokenize-node"),
       GENV_TYPESYSTEM.ANY_NODE_TYPE_ONE,
       tokenize_return_type),
-      FunctionConsts::FULL_TEXT_TOKENIZE_1);
+      FunctionConsts::FULL_TEXT_TOKENIZE_NODE_1);
   }
   {
     DECL_WITH_KIND(sctx,
-      full_text_tokenize,
-      (createQName( FT_MODULE_NS, "", "tokenize"),
+      full_text_tokenize_node,
+      (createQName( FT_MODULE_NS, "", "tokenize-node"),
       GENV_TYPESYSTEM.ANY_NODE_TYPE_ONE,
       GENV_TYPESYSTEM.LANGUAGE_TYPE_ONE,
       tokenize_return_type),
-      FunctionConsts::FULL_TEXT_TOKENIZE_2);
+      FunctionConsts::FULL_TEXT_TOKENIZE_NODE_2);
   }
 
   xqtref_t tokenizer_properties_return_type =
 
=== modified file 'src/functions/func_ft_module_impl.h'
--- src/functions/func_ft_module_impl.h	2012-05-09 20:40:03 +0000
+++ src/functions/func_ft_module_impl.h	2012-05-17 22:48:19 +0000
@@ -30,11 +30,11 @@
 ///////////////////////////////////////////////////////////////////////////////
 
 //full-text:tokenize
-class full_text_tokenize : public function
+class full_text_tokenize_node : public function
 {
 public:
-  full_text_tokenize(const signature& sig, FunctionConsts::FunctionKind kind)
-  :
+  full_text_tokenize_node(const signature& sig,
+                          FunctionConsts::FunctionKind kind) :
     function(sig, kind)
   {
 
 
=== modified file 'src/functions/function_consts.h'
--- src/functions/function_consts.h	2012-05-08 23:49:22 +0000
+++ src/functions/function_consts.h	2012-05-17 22:48:19 +0000
@@ -229,8 +229,8 @@
   FULL_TEXT_CURRENT_COMPARE_OPTIONS_0,
   FULL_TEXT_TOKENIZER_PROPERTIES_1,
   FULL_TEXT_TOKENIZER_PROPERTIES_0,
-  FULL_TEXT_TOKENIZE_2,
-  FULL_TEXT_TOKENIZE_1,
+  FULL_TEXT_TOKENIZE_NODE_2,
+  FULL_TEXT_TOKENIZE_NODE_1,
 #endif
 
 #include "functions/function_enum.h"
 
=== modified file 'src/runtime/full_text/ft_module_impl.cpp'
--- src/runtime/full_text/ft_module_impl.cpp	2012-05-17 15:21:43 +0000
+++ src/runtime/full_text/ft_module_impl.cpp	2012-05-17 22:48:19 +0000
@@ -528,14 +528,15 @@
 
 ///////////////////////////////////////////////////////////////////////////////
 
-TokenizeIterator::TokenizeIterator( static_context *sctx, QueryLoc const &loc,
-                                    std::vector<PlanIter_t>& children ) :
-  NaryBaseIterator<TokenizeIterator,TokenizeIteratorState>(sctx, loc, children)
+TokenizeNodeIterator::TokenizeNodeIterator( static_context *sctx,
+                                            QueryLoc const &loc,
+                                            std::vector<PlanIter_t>& children ):
+  NaryBaseIterator<TokenizeNodeIterator,TokenizeNodeIteratorState>(sctx, loc, children)
 {
   initMembers();
 }
 
-void TokenizeIterator::initMembers() {
+void TokenizeNodeIterator::initMembers() {
   GENV_ITEMFACTORY->createQName(
     token_qname_, static_context::ZORBA_FULL_TEXT_FN_NS, "", "token" );
 
@@ -555,8 +556,8 @@
     ref_qname_, "", "", "node-ref" );
 }
 
-bool TokenizeIterator::nextImpl( store::Item_t &result,
-                                 PlanState &plan_state ) const {
+bool TokenizeNodeIterator::nextImpl( store::Item_t &result,
+                                     PlanState &plan_state ) const {
   store::Item_t node_name, attr_node;
   zstring base_uri;
   store::Item_t item;
@@ -567,8 +568,8 @@
   store::Item_t type_name;
   zstring value_string;
 
-  TokenizeIteratorState *state;
-  DEFAULT_STACK_INIT( TokenizeIteratorState, state, plan_state );
+  TokenizeNodeIteratorState *state;
+  DEFAULT_STACK_INIT( TokenizeNodeIteratorState, state, plan_state );
 
   if ( consumeNext( state->doc_item_, theChildren[0], plan_state ) ) {
     if ( theChildren.size() > 1 ) {
@@ -651,19 +652,19 @@
   STACK_END( state );
 }
 
-void TokenizeIterator::resetImpl( PlanState &plan_state ) const {
-  NaryBaseIterator<TokenizeIterator,TokenizeIteratorState>::
+void TokenizeNodeIterator::resetImpl( PlanState &plan_state ) const {
+  NaryBaseIterator<TokenizeNodeIterator,TokenizeNodeIteratorState>::
     resetImpl( plan_state );
-  TokenizeIteratorState *const state =
-    StateTraitsImpl<TokenizeIteratorState>::getState(
+  TokenizeNodeIteratorState *const state =
+    StateTraitsImpl<TokenizeNodeIteratorState>::getState(
       plan_state, this->theStateOffset
     );
   state->doc_tokens_->reset();
 }
 
-void TokenizeIterator::serialize( serialization::Archiver &ar ) {
+void TokenizeNodeIterator::serialize( serialization::Archiver &ar ) {
   serialize_baseclass(
-    ar, (NaryBaseIterator<TokenizeIterator,TokenizeIteratorState>*)this
+    ar, (NaryBaseIterator<TokenizeNodeIterator,TokenizeNodeIteratorState>*)this
   );
   if ( !ar.is_serializing_out() )
     initMembers();
 
=== modified file 'src/runtime/full_text/pregenerated/ft_module.cpp'
--- src/runtime/full_text/pregenerated/ft_module.cpp	2012-05-08 23:49:22 +0000
+++ src/runtime/full_text/pregenerated/ft_module.cpp	2012-05-17 22:48:19 +0000
@@ -295,12 +295,12 @@
 
 #endif
 #ifndef ZORBA_NO_FULL_TEXT
-// <TokenizeIterator>
-TokenizeIterator::class_factory<TokenizeIterator>
-TokenizeIterator::g_class_factory;
+// <TokenizeNodeIterator>
+TokenizeNodeIterator::class_factory<TokenizeNodeIterator>
+TokenizeNodeIterator::g_class_factory;
 
 
-void TokenizeIterator::accept(PlanIterVisitor& v) const {
+void TokenizeNodeIterator::accept(PlanIterVisitor& v) const {
   v.beginVisit(*this);
 
   std::vector<PlanIter_t>::const_iterator lIter = theChildren.begin();
@@ -312,17 +312,17 @@
   v.endVisit(*this);
 }
 
-TokenizeIterator::~TokenizeIterator() {}
+TokenizeNodeIterator::~TokenizeNodeIterator() {}
 
-TokenizeIteratorState::TokenizeIteratorState() {}
+TokenizeNodeIteratorState::TokenizeNodeIteratorState() {}
 
-TokenizeIteratorState::~TokenizeIteratorState() {}
+TokenizeNodeIteratorState::~TokenizeNodeIteratorState() {}
 
 
-void TokenizeIteratorState::reset(PlanState& planState) {
+void TokenizeNodeIteratorState::reset(PlanState& planState) {
   PlanIteratorState::reset(planState);
 }
-// </TokenizeIterator>
+// </TokenizeNodeIterator>
 
 #endif
 #ifndef ZORBA_NO_FULL_TEXT
 
=== modified file 'src/runtime/full_text/pregenerated/ft_module.h'
--- src/runtime/full_text/pregenerated/ft_module.h	2012-05-08 23:49:22 +0000
+++ src/runtime/full_text/pregenerated/ft_module.h	2012-05-17 22:48:19 +0000
@@ -455,20 +455,20 @@
  *
  * Author:
  */
-class TokenizeIteratorState : public PlanIteratorState
+class TokenizeNodeIteratorState : public PlanIteratorState
 {
 public:
   store::Item_t doc_item_; //
   FTTokenIterator_t doc_tokens_; //
 
-  TokenizeIteratorState();
+  TokenizeNodeIteratorState();
 
-  ~TokenizeIteratorState();
+  ~TokenizeNodeIteratorState();
 
   void reset(PlanState&);
 };
 
-class TokenizeIterator : public NaryBaseIterator<TokenizeIterator, TokenizeIteratorState>
+class TokenizeNodeIterator : public NaryBaseIterator<TokenizeNodeIterator, TokenizeNodeIteratorState>
 {
 protected:
   store::Item_t token_qname_; //
@@ -478,20 +478,20 @@
   store::Item_t value_qname_; //
   store::Item_t ref_qname_; //
 public:
-  SERIALIZABLE_CLASS(TokenizeIterator);
+  SERIALIZABLE_CLASS(TokenizeNodeIterator);
 
-  SERIALIZABLE_CLASS_CONSTRUCTOR2T(TokenizeIterator,
-    NaryBaseIterator<TokenizeIterator, TokenizeIteratorState>);
+  SERIALIZABLE_CLASS_CONSTRUCTOR2T(TokenizeNodeIterator,
+    NaryBaseIterator<TokenizeNodeIterator, TokenizeNodeIteratorState>);
 
   void serialize( ::zorba::serialization::Archiver& ar);
 
-  TokenizeIterator(
+  TokenizeNodeIterator(
     static_context* sctx,
     const QueryLoc& loc,
     std::vector<PlanIter_t>& children)
     ;
 
-  virtual ~TokenizeIterator();
+  virtual ~TokenizeNodeIterator();
 
 public:
   void initMembers();
 
=== modified file 'src/runtime/spec/full_text/ft_module.xml'
--- src/runtime/spec/full_text/ft_module.xml	2012-05-08 23:49:22 +0000
+++ src/runtime/spec/full_text/ft_module.xml	2012-05-17 22:48:19 +0000
@@ -167,7 +167,7 @@
 </zorba:state>
 </zorba:iterator>
 
-<zorba:iterator name="TokenizeIterator"
+<zorba:iterator name="TokenizeNodeIterator"
   generateResetImpl="true"
   generateSerialize="false"
   generateConstructor="false"
 
=== modified file 'src/runtime/visitors/pregenerated/planiter_visitor.h'
--- src/runtime/visitors/pregenerated/planiter_visitor.h	2012-05-08 23:49:22 +0000
+++ src/runtime/visitors/pregenerated/planiter_visitor.h	2012-05-17 22:48:19 +0000
@@ -227,7 +227,7 @@
 class ThesaurusLookupIterator;
 #endif
 #ifndef ZORBA_NO_FULL_TEXT
-class TokenizeIterator;
+class TokenizeNodeIterator;
 #endif
 #ifndef ZORBA_NO_FULL_TEXT
 class TokenizerPropertiesIterator;
@@ -951,8 +951,8 @@
   virtual void endVisit ( const ThesaurusLookupIterator& ) = 0;
 #endif
 #ifndef ZORBA_NO_FULL_TEXT
-  virtual void beginVisit ( const TokenizeIterator& ) = 0;
-  virtual void endVisit ( const TokenizeIterator& ) = 0;
+  virtual void beginVisit ( const TokenizeNodeIterator& ) = 0;
+  virtual void endVisit ( const TokenizeNodeIterator& ) = 0;
 #endif
 #ifndef ZORBA_NO_FULL_TEXT
   virtual void beginVisit ( const TokenizerPropertiesIterator& ) = 0;
 
=== modified file 'src/runtime/visitors/pregenerated/printer_visitor.cpp'
--- src/runtime/visitors/pregenerated/printer_visitor.cpp	2012-05-08 23:49:22 +0000
+++ src/runtime/visitors/pregenerated/printer_visitor.cpp	2012-05-17 22:48:19 +0000
@@ -1412,18 +1412,18 @@
 
 #endif
 #ifndef ZORBA_NO_FULL_TEXT
-// <TokenizeIterator>
-void PrinterVisitor::beginVisit ( const TokenizeIterator& a) {
-  thePrinter.startBeginVisit("TokenizeIterator", ++theId);
+// <TokenizeNodeIterator>
+void PrinterVisitor::beginVisit ( const TokenizeNodeIterator& a) {
+  thePrinter.startBeginVisit("TokenizeNodeIterator", ++theId);
   printCommons( &a, theId );
   thePrinter.endBeginVisit( theId );
 }
 
-void PrinterVisitor::endVisit ( const TokenizeIterator& ) {
+void PrinterVisitor::endVisit ( const TokenizeNodeIterator& ) {
   thePrinter.startEndVisit();
   thePrinter.endEndVisit();
 }
-// </TokenizeIterator>
+// </TokenizeNodeIterator>
 
 #endif
 #ifndef ZORBA_NO_FULL_TEXT
 
=== modified file 'src/runtime/visitors/pregenerated/printer_visitor.h'
--- src/runtime/visitors/pregenerated/printer_visitor.h	2012-05-08 23:49:22 +0000
+++ src/runtime/visitors/pregenerated/printer_visitor.h	2012-05-17 22:48:19 +0000
@@ -348,8 +348,8 @@
 #endif
 
 #ifndef ZORBA_NO_FULL_TEXT
-  void beginVisit( const TokenizeIterator& );
-  void endVisit ( const TokenizeIterator& );
+  void beginVisit( const TokenizeNodeIterator& );
+  void endVisit ( const TokenizeNodeIterator& );
 #endif
 
 #ifndef ZORBA_NO_FULL_TEXT
 
=== renamed file 'test/rbkt/ExpQueryResults/zorba/fulltext/ft-module-tokenize-1.xml.res' => 'test/rbkt/ExpQueryResults/zorba/fulltext/ft-module-tokenize-node-1.xml.res'
=== renamed file 'test/rbkt/ExpQueryResults/zorba/fulltext/ft-module-tokenize-2.xml.res' => 'test/rbkt/ExpQueryResults/zorba/fulltext/ft-module-tokenize-node-2.xml.res'
=== renamed file 'test/rbkt/ExpQueryResults/zorba/fulltext/ft-module-tokenize-3.xml.res' => 'test/rbkt/ExpQueryResults/zorba/fulltext/ft-module-tokenize-node-3.xml.res'
=== renamed file 'test/rbkt/ExpQueryResults/zorba/fulltext/ft-module-tokenize-4.xml.res' => 'test/rbkt/ExpQueryResults/zorba/fulltext/ft-module-tokenize-node-4.xml.res'
=== renamed file 'test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-1.xq' => 'test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-1.xq'
--- test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-1.xq	2012-05-08 17:24:54 +0000
+++ test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-1.xq	2012-05-17 22:48:19 +0000
@@ -2,7 +2,7 @@
 import schema namespace fts = "http://www.zorba-xquery.com/modules/full-text";
 
 let $doc := <msg>hello, world</msg>
-let $tokens := ft:tokenize( $doc, xs:language("en") )
+let $tokens := ft:tokenize-node( $doc, xs:language("en") )
 let $t1 := validate { $tokens[1] }
 let $t2 := validate { $tokens[2] }
 
 
=== renamed file 'test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-2.xq' => 'test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-2.xq'
--- test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-2.xq	2012-05-05 11:37:42 +0000
+++ test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-2.xq	2012-05-17 22:48:19 +0000
@@ -1,9 +1,8 @@
 import module namespace ft = "http://www.zorba-xquery.com/modules/full-text";
-
 import schema namespace fts = "http://www.zorba-xquery.com/modules/full-text";
 
 let $doc := <msg xml:lang="es">hola, mundo</msg>
-let $tokens := ft:tokenize( $doc )
+let $tokens := ft:tokenize-node( $doc )
 let $t1 := validate { $tokens[1] }
 let $t2 := validate { $tokens[2] }
 
 
=== renamed file 'test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-3.xq' => 'test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-3.xq'
--- test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-3.xq	2012-05-05 16:28:22 +0000
+++ test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-3.xq	2012-05-17 22:48:19 +0000
@@ -4,7 +4,7 @@
 import schema namespace fts = "http://www.zorba-xquery.com/modules/full-text";
 
 let $x := <p xml:lang="en">Houston, we have a <em>problem</em>!</p>
-let $tokens := ft:tokenize( $x )
+let $tokens := ft:tokenize-node( $x )
 let $node-ref := (validate { $tokens[5] })/@node-ref
 let $node := ref:node-by-reference( $node-ref )
 return $node instance of text()
 
=== renamed file 'test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-4.xq' => 'test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-4.xq'
--- test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-4.xq	2012-05-05 16:28:22 +0000
+++ test/rbkt/Queries/zorba/fulltext/ft-module-tokenize-node-4.xq	2012-05-17 22:48:19 +0000
@@ -4,7 +4,7 @@
 import schema namespace fts = "http://www.zorba-xquery.com/modules/full-text";
 
 let $x := <msg xml:lang="en" content="Houston, we have a problem!"/>
-let $tokens := ft:tokenize( $x/@content )
+let $tokens := ft:tokenize-node( $x/@content )
 let $node-ref := (validate { $tokens[5] }) /@node-ref
 let $node := ref:node-by-reference( $node-ref )
 return $node instance of attribute(content)