> so where would the logic of "apply if changed" be placed?
In the latest iteration of this branch, our "apply if changed" logic will require some follow-up work.
I think we need a branch that will decompose get_data into the following separate methods:
1. ds.detect() - speedy/low-cost Python logic (akin to ds-identify) that determines whether the environment is not a match for this datasource, so it can get out fast
2. ds.crawl_metadata(): performs a read-only walk (not affecting the datasource caches) of all related instance-data sources, returning a dict
3. ds.get_data() renamed to ds.process_data(crawled_data):
a. will take the crawled_data dictionary, which will be processed and persisted as the instance attributes metadata, vendordata, and userdata
b. write instance-data.json
c. cache crawled_data as ds._crawled_data for future comparison by the metadata_changed method
4. ds.metadata_changed(crawled_data): compares crawled_data to the internally cached crawled_data, returning a list of keys whose content changed
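To make step 4 concrete, a minimal sketch of metadata_changed as a plain dict diff (a hypothetical standalone helper, not code from the branch) could look like:

```python
def metadata_changed(cached, crawled):
    """Return the sorted list of top-level keys whose content differs
    between the cached crawl and the newly crawled data.

    Keys present in only one of the two dicts also count as changed.
    """
    all_keys = set(cached) | set(crawled)
    return sorted(k for k in all_keys if cached.get(k) != crawled.get(k))
```

For example, comparing `{"user-data": "a", "meta-data": {}}` against a new crawl where only user-data differs would return `["user-data"]`, letting the caller decide whether a re-apply is warranted.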
update_metadata will then look like this:

```python
if supported_events:
    if hasattr(self, 'detect'):
        if not self.detect():
            return False
    crawled_data = self.crawl_metadata()
    if self.metadata_changed(crawled_data):
        result = self.process_data(crawled_data)
```
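Wiring the four proposed methods together, a self-contained sketch of the whole lifecycle might look like the following. This is illustrative only: the class name, the hard-coded crawl result, and the return conventions are assumptions for the example, not the real cloud-init implementation.

```python
class DataSource:
    """Sketch of the decomposed datasource lifecycle (illustrative)."""

    def __init__(self):
        self._crawled_data = None  # cached crawl for change comparison
        self.metadata = None
        self.userdata = None
        self.vendordata = None

    def detect(self):
        # Cheap ds-identify-style check; True means this datasource
        # might apply in the current environment.
        return True

    def crawl_metadata(self):
        # Read-only walk of the platform's metadata sources.
        # Hard-coded here for illustration.
        return {"meta-data": {"instance-id": "i-123"},
                "user-data": "#cloud-config\n",
                "vendor-data": ""}

    def metadata_changed(self, crawled_data):
        # Compare against the cached crawl; everything counts as
        # changed on the first run.
        if self._crawled_data is None:
            return sorted(crawled_data)
        all_keys = set(self._crawled_data) | set(crawled_data)
        return sorted(k for k in all_keys
                      if self._crawled_data.get(k) != crawled_data.get(k))

    def process_data(self, crawled_data):
        # Persist crawled values as instance attributes and cache the
        # crawl for the next metadata_changed comparison.  A real
        # implementation would also write instance-data.json here.
        self.metadata = crawled_data.get("meta-data")
        self.userdata = crawled_data.get("user-data")
        self.vendordata = crawled_data.get("vendor-data")
        self._crawled_data = crawled_data
        return True

    def update_metadata(self, supported_events=True):
        if not supported_events:
            return False
        if hasattr(self, "detect") and not self.detect():
            return False
        crawled_data = self.crawl_metadata()
        if self.metadata_changed(crawled_data):
            return self.process_data(crawled_data)
        return False
```

With this shape, the first update_metadata() call applies the crawled data, while a second call with an identical crawl returns False without re-processing, which is exactly where the "apply if changed" decision lives.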