Dynamics Ax Technical

Dynamics Ax 2012 History cleanup

One of my most popular posts is Cleaning up the AIF document log, but that is not the only table that could benefit from a regular cleanup. For some of these standard tables there is little or no cleanup support; the cleanup for the AIF document log, for example, only runs online in the client without any option to schedule it in batch.

I know that a lot of partners and customers use SQL scripts (as I did for the AIF post) to delete this data but there are some things to keep in mind:

  • Pros:
    • Very fast.
    • No new release of models needed.
  • Cons:
    • Deleting a big volume of data can cause excessive locking and can expand the database log file to the point where the disk runs out of space.
    • All business logic is skipped and new customizations might be ignored. For example: a new delete action could be skipped, causing orphaned data.

Because of these reasons I started thinking about building a simple framework that is easy to extend, can limit the amount of data per run (so database transactions and log file growth stay bounded), and can of course be scheduled in batch.

So here’s the result:

  • Type: This enum is what makes the framework easy to extend; the classes that do the processing use the extension framework to execute the correct logic.
  • Number of days: This parameter defines the retention period in days.
  • Number of records in transaction: The maximum number of records that will be deleted in one database transaction. If your SQL Server is configured to use lock escalation, deleting a large number of records in one transaction could escalate to a table lock, which blocks all other processes on the same table.
  • Number of bulks: One transaction is one bulk. This is very useful to avoid flooding the transaction log. For example: if a transaction log backup runs every hour, you could schedule the cleanup to run hourly for a maximum amount of data. Once the backup has finished, the log file space is freed up again and the cleanup can run once more.
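
To make these parameters concrete, the core of each processor follows roughly this pattern (a simplified sketch, not the actual framework source; the BatchJobHistory table and the variable names here are illustrative):

```xpp
// Sketch of the bulk/transaction pattern described above
int             bulk;
int             deleted;
BatchJobHistory batchJobHistory;

for (bulk = 1; bulk <= maxBulks; bulk++)
{
    deleted = 0;

    ttsBegin;
    while select forUpdate batchJobHistory
        where batchJobHistory.CreatedDateTime < DateTimeUtil::addDays(DateTimeUtil::utcNow(), -retentionDays)
    {
        batchJobHistory.delete(); // runs delete actions and business logic
        deleted++;

        if (deleted >= maxRecordsPerTransaction)
        {
            break; // commit this bulk and start a new transaction
        }
    }
    ttsCommit;

    if (deleted < maxRecordsPerTransaction)
    {
        break; // nothing left to delete
    }
}
```

Each ttsCommit keeps the transaction small, and the bulk counter caps the total work done per batch run.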

So with this example I already provide three of the most-used scenarios in standard Ax:

  • Batch history: This job cleans the BatchJobHistory and related (delete actions) tables with the following ranges:
    • CreatedDatetime: Older than the number of days.
    • Status: Ended.
  • AIF logging: This job cleans the AifMessageLog and related (delete actions) tables with the following ranges:
    • CreatedDatetime: Older than the number of days.
    • Status: Processed.
  • Database logging: This job cleans the SysDataBaseLog table with the following ranges:
    • CreatedDatetime: Older than the number of days.

If you want to extend this with other scripts for new tables all you have to do is this:

  • Add your new type to the BLOGHistoryCleanupType enum.
  • Make a new class that uses the BLOGHistoryCleanupAttribute with this enum value, inherit from BLOGHistoryCleanupProcessorBase and implement the run method.
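
Put together, a new processor could look something like this (a hypothetical sketch using the names above; the MyNewType enum value and the class name are illustrative):

```xpp
// Hypothetical cleanup processor for a new type
[BLOGHistoryCleanupAttribute(BLOGHistoryCleanupType::MyNewType)]
class BLOGHistoryCleanupMyNewType extends BLOGHistoryCleanupProcessorBase
{
    public void run()
    {
        // Delete the records for this type here, respecting the
        // number-of-days and transaction/bulk limits from the setup.
    }
}
```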

The source is below, enjoy!


(This tool has only been tested on a Dynamics Ax 2012 R3 CU12 environment; please test this before putting it into a production environment and use it at your own risk.)


Dynamics Ax 2012 my ideal Azure VM setup


This is just a quick note on how I set up Ax 2012 on an Azure VM to get the most bang for my buck. The example I’m using is a DEV machine where I keep the sample code for this blog, but you could apply the same principles to every environment.

Sizing & disks

Since I’m on a limited budget with my Visual Studio Enterprise subscription, I don’t use premium storage. So my favorites are the D2/D3 and D11 v2 machines, especially the D11: it has more memory and fewer cores (compared to the D2/D3), it can have 4 data disks, and its temporary local storage is larger than that of the DS machines. That storage will become very useful for SQL Server, and the extra memory is also a must.

Since standard disks are limited to 500 IOPS and 60 MB/s throughput I like to attach multiple disks and spread my installations over these. So I did my installation like this:

  • Disk 1: SQL service + SSMS + LDF files + backup folder.
  • Disk 2: SQL MDF files.
  • Disk 3: Dynamics Ax client and service + Visual Studio.

(You can even add a 4th disk on this machine size but I didn’t really need it.)

SQL Server

Another neat performance trick is to use the temporary local SSD for the SQL Server tempdb and the buffer pool extension file. Check the link below on how to do it; I have also modified their PowerShell startup script to start my AOS.


# Customize the service names and temp folder to your installation
$tempfolder = 'D:\SQLTemp'   # folder on the temporary local disk, adjust to your setup
$SQLService = 'SQL Server (MSSQLSERVER)'
$SQLAgentService = 'SQL Server Agent (MSSQLSERVER)'
$AXService = 'Microsoft Dynamics AX Object Server 6.3$01-AX2012R3'

# Recreate the temp folder if the temporary disk was reprovisioned
if (!(Test-Path -Path $tempfolder)) {
    New-Item -ItemType Directory -Path $tempfolder
}

Start-Service $SQLService
Start-Service $SQLAgentService
Start-Service $AXService


I prefer starting and stopping my machine manually but another good way is to use Azure Automation to automatically start and stop your machine when you are not using it. This will save a lot of money on your bill.

If this is too much trouble but you’re afraid you might forget to shut down your VM, you can also use the auto-shutdown feature, which can be found in the menu of every VM.


Dynamics Ax 2012 fill factor

Database synchronize

We all know that Dynamics Ax constructs its tables and indexes during a DB synchronize, and if we made changes directly in SQL, these could be overridden by Ax when installing model updates and syncing them to the database. But there’s more to creating an index than the defaults Ax uses, and in my experience the fill factor is never specified, or the SQL Server default fill factor is used.

Wait what fill factor?!

If this is your first question you might want to read this 🙂 After reading it you should come to the conclusion that on certain tables we might need a proper fill factor to reduce IO on the storage subsystem.

What options do we have?

  • Maintenance: With the standard SQL maintenance jobs, or tools like the one from Ola Hallengren, we can specify one general fill factor. But setting one value for all tables might be more of a problem than not setting it at all.
  • Scripting: We can script something to change these after a maintenance but execution might be forgotten, indexes might be renamed, …
  • Ax: What if we can setup Ax so it wouldn’t overwrite our settings?

The right way

We can setup a fill factor by going to the System Administrator module -> Periodic -> Database -> SQL Administration. This screen uses a treeview to define some extra SQL instructions like the fill factor per table or per index.

But since this screen uses a treeview with way too much data, it’s a nightmare to work in: you can’t load Excel sheets, and all the buttons execute long-running processes online on the client tier without any possibility to run them in batch. That’s why I decided to write my own solution for this problem. (Source at the bottom of the post.)

This screen is a sort of automated frontend for what the treeview screen does, and consists of three steps. (When no indexes are specified for a table, the logic runs for all indexes.)

  1. Initialize: This service fills the setup table based on the table group; transaction tables, for instance, could benefit from this. (In real-life situations I use another method, based on SQL DMVs, explained below.)
  2. Process: This service processes our setup to the standard Ax (kernel) tables.
  3. Reindex: This service reindexes and also sets the fill factor on the indexes we have specified in our setup.

Finding the right fill factor

Rather than my first idea of basing the setup on the table group, I would use a better approach. If you measure a high number of page splits, especially during business hours, you want to find out which tables cause them.

In the blog posts linked below you can find good tips on how to find indexes that need tuning. I usually start by setting the fill factor to 95% and work my way from there. (Make sure these are tables with frequent update or delete scenarios; on a table with only inserts, where the clustered index only adds pages at the end, setting a fill factor might not be so useful. Hence setting a fill factor on a RecId index is probably not such a good idea.)


(This tool has only been tested on a Dynamics Ax 2012 R3 CU12 environment; please test this before putting it into a production environment and use it at your own risk.)


Dynamics Ax 2012 Trace parser with ETW

Hi All,

The following subject has existed for a while, but I found it so useful that I want to get it out there once more!

Let’s consider the following scenario: there’s a performance issue in production, and there is no other environment around where the customer or the partner is able to reproduce it, or the database is just too big. (Yes, I know, don’t comment on this statement 🙂 ) The last thing you want to do is put this environment under even more load by installing tools or writing extra code just to log something, so maybe this is the solution for you.

When you download Performance Analyzer for Dynamics Ax, it comes with all kinds of SQL goodies, but also with two Performance Monitor templates (AX_Trace_Detail.xml and AX_Trace_ClientAccessOnly.xml), which you can find in the “DynamicsPerf\Windows Perfmon Scripts” folder. As the filenames suggest, these allow you to trace Ax. But wait, don’t we already have that in the tracing cockpit of the client? Yes, we do, but the cockpit doesn’t allow you to schedule a trace, which is really handy for investigating a batch process running at night in production. And let’s be honest: do you trust starting this on an Ax client in a production environment?

So here’s how to set it up:

  • Open the Windows Performance Monitor.
  • Create a new Data Collector Set.
  • Use a template to create the Data Collector Set.
  • Click Browse and pick one of the templates delivered by the Performance Analyzer solution. AX_Trace_Detail.xml or AX_Trace_ClientAccessOnly.xml. This last one is useful for putting on RDP or Citrix servers to reduce the amount of logging.
  • Change the default “%systemdrive%\PerfLogs\Admin\DAX 2012 Trace” path to another drive with enough space, and avoid using the same drive as Dynamics Ax so that the application stays stable if the trace drive runs full.
  • After this you can use the standard functionality to set up schedules, disk usage, copies, …
  • Start and stop the tracing and import the .etl file in the Dynamics Ax 2012 Trace Parser.

It’s just that simple, and this feature is available on every Windows installation. Just be aware that this generates a high volume of logs, so you want to set up the data manager really well. Processing the trace in the Trace Parser can also take a long time, so I suggest you always do this on an environment other than production.

Happy tracing!

Dynamics Ax Functional

Enterprise portal: work items for Purchase orders

Hi all,

Those of you who work with workflow in the Procurement flow might have encountered the following problem. The client wants to approve Purchase Orders through the Enterprise Portal, but for some reason the work items are not visible, while they are visible in AX Client.

Fair enough, you’d say: there actually is an option in the workflow to enable or disable this for the Enterprise Portal. Strangely enough, the checkbox is checked and there is no other reason why actions on work items would be blocked.

The problem is an actual standard Microsoft bug.

There is a difference in the workflow elements that can be used: ‘Approve purchase order’ and ‘Approve purchase order, editable’. The latter makes it possible for reviewers to edit the Purchase Order they need to approve; the former obviously doesn’t. Only when the latter is used do work item actions become unavailable in the Enterprise Portal.

Open AOT>Workflow>Approvals

You will find 2 approval elements: PurchTableApproval and PurchTableApprovalEdit


If we compare these two elements:


It is clear that the ‘ActionWebMenuItem’ is missing in the properties of the ‘PurchTableApprovalEdit’, for Approve/Reject/RequestChange. To solve our problem, we will simply need to fill these in on the Edit-element:

Approve: EPPurchTableApprovalApprove
Reject: EPPurchTableApprovalReject
RequestChange: EPPurchTableApprovalRequestChange

On the headnode we’ll need to fill out three missing fields as well:


DocumentWebMenuItem: EPPurchTableInfo
ResubmitWebMenuItem: EPPurchTableWorkflowReSubmit
DelegateWebMenuItem: EPPurchTableApprovalDelegate

This will fix the work item problem! Microsoft has stated that workflow functionality will change in AX7 and so this issue will not be resolved for AX2012.

Dynamics Ax Programming Technical

Dynamics Ax Management Shell powershell tips

Hi all,

When Dynamics Ax 2012 came out, it shipped with two tools: AxUtil.exe and the Management Shell. In my experience most Dynamics Ax developers are already familiar with AxUtil.exe but don’t have much experience with PowerShell yet. Therefore I decided to write some examples to get you going. If you have any questions or a request, please leave a comment; I might also add some scripts as I go.

# Load the Management utilities into a standard powershell session
# If you use the management shell you don't need this
$dynamicsSetupRegKey = Get-Item "HKLM:\SOFTWARE\Microsoft\Dynamics\6.0\Setup"
$sourceDir = $dynamicsSetupRegKey.GetValue("InstallDir")
$dynamicsAXUtilPath = Join-Path $sourceDir "ManagementUtilities\Microsoft.Dynamics.ManagementUtilities.ps1"
. $dynamicsAXUtilPath

# List all the models like AxUtil does
Get-AXModel | Sort-Object ModelId | Format-Table -Property ModelId,Layer,Name,DisplayName,Version -AutoSize

# List all the models like AxUtil does, but filter out the SYS and SYP layers
Get-AXModel | Where-Object {$_.Layer -NotLike "Sy*"} | Sort-Object ModelId | Format-Table -Property ModelId,Layer,Name,DisplayName,Version -AutoSize

# Get all elements in a model file
$models = Get-AXModel -File C:\Temp\model.axmodel -Details
$models.Elements | Format-Table -Property Path
C# .NET Dynamics Ax Programming Technical

Dynamics Ax custom WCF service with paging support

Hi all,

Lately I’ve been busy developing WCF services to communicate with .NET web applications. All of these web services are custom-made and use .NET data contracts, so that every application uses the same contracts. Due to the high data volumes and performance requirements, we had to implement some kind of paging. I had no clue that Ax even has paging support, but it does, with properties on the QueryRun object.

For example purposes I’ve made a service which uses .NET request and response contracts. I prefer this way over X++ data contracts because this is more reusable and flexible on the client side. The code is self-explanatory to me but you can always pose questions of course. 😉

The request contract:

public class ItemListRequest
{
    public long StartingPosition { get; set; }
    public long NumberOfRecordsToFetch { get; set; }
}

The response contract:

public class ItemListResponse
{
    public int TotalNumberOfRecords { get; set; }
    public ArrayList Items { get; set; }
}

public class Item
{
    public string Id { get; set; }
    public string Name { get; set; }
}

The service implementation:

public Blog.WCFPaging.DataContracts.ItemListResponse getItems(Blog.WCFPaging.DataContracts.ItemListRequest _request)
{
    Blog.WCFPaging.DataContracts.Item               item;
    System.Collections.ArrayList                    itemList    = new System.Collections.ArrayList();
    Blog.WCFPaging.DataContracts.ItemListResponse   response    = new Blog.WCFPaging.DataContracts.ItemListResponse();
    QueryRun        queryRun    = new QueryRun(queryStr(InventTable));
    InventTable     inventTable;

    if (    CLRInterop::getAnyTypeForObject(_request.get_StartingPosition()) > 0
        &&  CLRInterop::getAnyTypeForObject(_request.get_NumberOfRecordsToFetch()) > 0)
    {
        response.set_TotalNumberOfRecords(QueryRun::getQueryRowCount(queryRun.query(), maxInt()));
        queryRun.addPageRange(_request.get_StartingPosition(), _request.get_NumberOfRecordsToFetch());

        // At least one order by field should be declared when using paging
        SysQuery::findOrCreateDataSource(queryRun.query(), tableNum(InventTable)).addOrderByField(fieldNum(InventTable, ItemId));

        while (queryRun.next())
        {
            inventTable = queryRun.get(tableNum(InventTable));
            item        = new Blog.WCFPaging.DataContracts.Item();
            item.set_Id(inventTable.ItemId);
            item.set_Name(inventTable.itemName());
            itemList.Add(item);
        }

        response.set_Items(itemList);
    }

    return response;
}

Calling the service from a .NET application:

int pageSize = 10;

using (var client = new BLOGPagingServiceClient())
{
    BLOGPagingServiceGetItemsResponse response = null;
    var request = new ItemListRequest() { StartingPosition = 1, NumberOfRecordsToFetch = pageSize };

    do
    {
        response = client.getItems(new BLOGPagingServiceGetItemsRequest()
        {
            CallContext = new CallContext(),
            _request = request
        });

        foreach (Item item in response.response.Items)
        {
            Console.WriteLine(String.Format("{0, -10} - {1}", item.Id, item.Name));
        }

        request.StartingPosition += pageSize;
    }
    while (response.response.Items.Count > 0);
}

Paging on a QueryRun has been implemented since Ax 2009; more info on paging:
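
Stripped of the contract plumbing, the paging calls boil down to this (a minimal sketch of the same QueryRun calls used in the service above):

```xpp
static void PagingExample(Args _args)
{
    InventTable inventTable;
    QueryRun    queryRun = new QueryRun(queryStr(InventTable));
    int         totalRows;

    // Total row count, so a caller can compute the number of pages
    totalRows = QueryRun::getQueryRowCount(queryRun.query(), maxInt());

    // Fetch records 11 through 20 only
    queryRun.addPageRange(11, 10);

    // At least one order by field is required when using paging
    SysQuery::findOrCreateDataSource(queryRun.query(), tableNum(InventTable))
        .addOrderByField(fieldNum(InventTable, ItemId));

    while (queryRun.next())
    {
        inventTable = queryRun.get(tableNum(InventTable));
        info(strFmt("%1 (of %2 total)", inventTable.ItemId, totalRows));
    }
}
```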

I wonder why this isn’t implemented in the AIF services or is it? If anyone knows please leave a comment about it. 😉
