Data Archiving in Dynamics AX – Creation and Best Practices

Data archiving in AX is typically performed by copying data from heavily used AX tables into newly created ‘History’ tables.

Data archiving in AX is an easy-to-build solution to issues of table size and slow queries. Archiving keeps the active tables small while older data remains readily available for reporting or research. However, if done improperly, the task can be overly cumbersome, tax system resources heavily, or even cause data integrity issues.

To prevent these issues, here are some ideas to consider:

Batch Job vs Job

The actual archiving is usually best implemented as either a batch job or an AOT job, since data archiving runs infrequently (or even only once) and does not need to be tied into any other functionality in the system.

Batch Jobs

Batch jobs should be used in most cases. Data archiving is typically done on a timetable built around system downtime (e.g. archive the data once every 3 months), and batch jobs allow that recurrence to be set up at creation time and altered later as needed; a job, by contrast, requires manual intervention from a developer every time it needs to run.
MSDN article on creating batch jobs: (https://msdn.microsoft.com/en-us/library/cc636647.aspx)
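As a rough sketch, the shell of such a batch class might look like the following (the class name and method bodies are illustrative placeholders following the standard RunBaseBatch pattern, not code from the MSDN article):

class WHSDimTrackingArchiveBatch extends RunBaseBatch
{
    #define.CurrentVersion(1)
}

public container pack()
{
    // no dialog state to save in this minimal sketch
    return [#CurrentVersion];
}

public boolean unpack(container _packedClass)
{
    return true;
}

public boolean canGoBatch()
{
    // allows the class to be scheduled on the batch server
    return true;
}

public void run()
{
    // archiving logic goes here -- see the pseudo-code example at the end of this post
}

static void main(Args _args)
{
    WHSDimTrackingArchiveBatch archive = new WHSDimTrackingArchiveBatch();

    // prompt() shows the standard RunBase dialog, including the Batch tab
    // where the recurrence (e.g. every 3 months) is set up
    if (archive.prompt())
    {
        archive.run();
    }
}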

Jobs

Jobs should only be used when archiving requires a developer’s intervention each time it runs, or when the archiving will only ever happen once. This process is much more manual than the batch-job approach, which should be weighed when deciding how to archive.
AOT > Jobs > Right Click > New Job
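A one-off archiving job can be as small as the following sketch (the job name is a placeholder; the body would be the same insert_recordset / delete_from logic shown at the end of this post):

static void WHSDimTrackingArchiveJob(Args _args)
{
    // one-time archiving logic goes here (insert_recordset into the
    // history table, then delete_from the source table)
}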


Creating History Tables

The history tables are used to store the archived data, and their schema should match the original table exactly. The best way to achieve this is to duplicate the table:
Duplicating tables (http://daxdude.blogspot.com/2013/03/ax-tables-copy-all-table-data-to.html), step 1 of the article. The duplicate should then be renamed so that its purpose – storing the historical data of the original table – is obvious; I typically use ‘$TableName$History’.

Duplicating the table this way ensures that the history and original tables are identical.


Populating History Tables: insert_recordset vs while select insert

Populating the history tables requires copying records from one table into another, which can be done in one of two ways:

insert_recordset

Since data archiving revolves around copying large numbers of records from one table to another, an insert_recordset is ideal: it inserts multiple records with a single set-based call, making far fewer round trips to the database than a while select, and it will be much faster than the while select insert option.
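For example, a minimal sketch (MyTable, MyTableHistory, and their fields are illustrative placeholders, not actual AX tables):

MyTable        myTable;        // source table
MyTableHistory myTableHistory; // its history duplicate

// one set-based database call copies every matching record at once
insert_recordset myTableHistory (ItemId, Qty, TransDate)
    select ItemId, Qty, TransDate from myTable
        where myTable.TransDate < systemDateGet() - 365;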

while select with insert command

Ideally this approach should never be needed during data archiving, but if it is the only option, be aware that the performance of the archiving operation will suffer.
Example & Source: (https://msdn.microsoft.com/en-us/library/aa635694.aspx)


Purging data tables: delete_from complications

Purging of the source tables should be performed after the history tables have been populated. This ordering ensures that if the population fails, the job can error out without any data having been deleted. In general, a delete_from will perform this operation better than a while select delete.


delete_from

delete_from is much better suited to deleting large numbers of records than a while select delete, because, like insert_recordset, it makes a single set-based call to the database instead of one call per record.
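The while select delete counterpart looks like this sketch (using the WHS table from the example later in this post; the where clause is left open):

WHSDimTracking whsDimTracking;

ttsBegin;

while select forUpdate whsDimTracking
    where ...
{
    whsDimTracking.delete(); // one delete call per record -- slow for large tables
}

ttsCommit;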

However, if the table has an overridden delete() method or delete actions, delete_from loses its speed advantage and performs the same as a while select delete, because each record then has to be evaluated individually. When archiving, we rarely (if ever) want those delete methods or delete actions to run: they usually cascade to related data belonging to other AX processes and can remove records from other tables.

To prevent this, call these methods on your table buffer before the delete_from:

tableBuffer.skipDataMethods(true);   // skip the overridden delete() method
tableBuffer.skipDeleteActions(true); // skip cascading delete actions

This ensures the delete_from operates as intended and that archiving does not affect more data in AX than it should.
Source: (https://www.mail-archive.com/axapta-knowledge-village@yahoogroups.com/msg21296.html)

Pseudo-code example with a commonly used WHS table – typically placed in the run() method of the batch job class:


public void run()
{
    WHSDimTracking        whsDimTracking;
    WHSDimTrackingHistory whsDimTrackingHistory; // new history table

    ttsBegin;

    // copy the records to archive into the history table in one set-based call
    insert_recordset whsDimTrackingHistory (field-list)
        select field-list from whsDimTracking
            where ...; // exists joins are also possible here

    // keep the delete_from set-based
    whsDimTracking.skipDataMethods(true);
    whsDimTracking.skipDeleteActions(true);

    // purge the archived records from the source table
    delete_from whsDimTracking
        where ...; // exists joins are also possible here

    ttsCommit;
}


