Thursday 4 February 2016

Power BI - Embedded report with companies in Germany using Dynamics Ax


Hi guys,

From now on it is possible to publish Power BI reports embedded in web applications, so I am taking the opportunity to share with you the updated list of companies in Germany that I have identified so far using Microsoft Dynamics Ax. The information is not 100% reliable (for example the information about the Dynamics Ax version), but it will stay current because the report is refreshed automatically.

Power BI is cool! :)

Wednesday 24 December 2014

Adding the Update operation to the Inventory Counting Journal Document Service




Hi guys, it has been a while since my last post. The year 2014 is coming to an end, and looking back it can be said that it was a really interesting year. I managed to improve my German, but I also took on a new position and now I am working for one of the leading German clothing brands with Microsoft Dynamics Ax 2012 R3 on the Retail module.

Let's start with the purpose of this post: how to add the Update operation to the Inventory Counting Journal Document Service.

By default Microsoft has only provided the Create operation. It would be really useful to also have the Read operation, so we can create an Inventory Counting Journal in the system and send it out through AIF, but also the Update operation, so we can receive this journal back through AIF in order to update it and post it.

The Inventory Counting Journal

We can find the Inventory Counting Journal under Inventory Management, Journals, Invent Counting, Counting. I don't need to say too much about it. The purpose of this post is to create an Inventory Counting Journal manually, send it out with AIF (we will also add the find and read operations) and receive the counted Inventory Counting Journal back through AIF.




We have to set up the default Inventory Counting Journal for AIF in the Inventory and Warehouse Parameters, AIF Parameters section.




In order to update the standard document service for the Inventory Counting Journal with the read, update and additional operations, we use the Update document service wizard that we can find under Tools, Application Integration Framework, Update document service.

Here we provide the name of the service class InventCountingJournalService and activate the service operations read, update, find, findKeys, getKeys and getChangedKeys. We also activate "Regenerate data object classes" and "Update AxBC classes". We click OK and the wizard updates the Document Service classes with the new operations.


We also need to update the document service query AxdCountingJournal and set the "Update" property to "Yes" on the query data sources InventJournalTable and InventJournalTrans, because we want to be able to update data.

We also change the JoinMode property of the InventDim data source to "OuterJoin", because we want to be able to work with XML messages which contain no information about the InventDim record. If we don't change this property, the query will return nothing because of the InnerJoin with the InventDim table.

Next, we need to add the new operations to the InventCountingJournalService service. We right-click on the InventCountingJournalService and add the new operations under the Operations node.


Then we have to refresh the service in order to register the new operation definitions. We right-click on the InventCountingJournalService, select Add-Ins and then Register Service. We select the InventCountingJournalService and click Refresh to update the new operations on the service, so they can be used on outbound and inbound ports.


Next we need to check in the Axd&lt;Document&gt; class (AxdCountingJournal) and in the service class (InventCountingJournalService) whether the corresponding operation methods are implemented.

On the AxdCountingJournal class we check the methods:
  1. findEntityKeyList
  2. findList
  3. getActionList
  4. read
  5. readList
  6. update
  7. updateList

On the InventCountingJournalService class we check the methods:
  1. find
  2. findKeys
  3. read
  4. update
In all these methods we make sure that there is no line of code throwing the "Operation not implemented" error; where there is, we should modify the code to implement a call to the base class like this:
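A sketch of such a method after the fix, following the pattern of the standard AX 2012 document services (the attribute names and the document class name InventCountingJournal are taken from that pattern, so verify them against your generated classes):

// InventCountingJournalService.update after replacing the
// "Operation not implemented" stub with a call to the base class
[SysEntryPointAttribute, AifDocumentUpdateAttribute]
public void update(AifEntityKey _entityKey, InventCountingJournal _document)
{
    super(_entityKey, _document);
}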


Next we make some code customizations in the Document Service classes in order to make the update operation possible for the Inventory Counting Journal.

In the AxdCountingJournal class we make the following updates:

We add a new "createRecord" global variable that we will use to track whether we are creating a new record or only updating an existing one.
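In the classDeclaration this is just a plain boolean member, for example:

class AxdCountingJournal extends AxdBase
{
    // true while the incoming journal line does not exist yet and must be created
    boolean createRecord;
}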




We modify the prepareInventJournalTrans method so that we have code like the following:
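A rough sketch of the idea, assuming the helper receives the AxInventJournalTrans instance being saved (the exact signature of prepareInventJournalTrans in the standard class may differ):

protected void prepareInventJournalTrans(AxInventJournalTrans _axInventJournalTrans)
{
    InventJournalTrans inventJournalTrans;

    // Look for an existing journal line matching the incoming message
    select firstOnly inventJournalTrans
        where inventJournalTrans.JournalId == _axInventJournalTrans.parmJournalId()
           && inventJournalTrans.LineNum   == _axInventJournalTrans.parmLineNum();

    createRecord = !inventJournalTrans.RecId;

    if (!createRecord)
    {
        // The line may already have been counted: keep the quantity stored on
        // the record (the difference between on-hand and counted quantity,
        // which is the quantity posted on InventTrans)...
        _axInventJournalTrans.parmQty(inventJournalTrans.Qty);

        // ...and make sure the line keeps its inventory dimensions
        _axInventJournalTrans.parmInventDimId(inventJournalTrans.InventDimId);
    }
}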



In this code update we check if the InventJournalTrans record already exists in the system. If not, we set the createRecord global variable to true. Otherwise we are updating the record, so we take the current value of the Qty field from the InventJournalTrans record, because this line may already have been counted (the Qty field holds the difference between the on-hand and counted quantities and is the quantity posted on InventTrans), and we also set the proper InventDimId value on the InventJournalTrans record.

In the AxdCountingJournal class we also modify the method prepareForSave with code like the following, in order to avoid increasing the total number of lines in the journal when we only update existing lines.
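A sketch under the assumption that the standard method increments the journal header's NumOfLines for every incoming line through the AxInventJournalTable class; we simply guard that increment with the createRecord flag (member names may differ slightly in your version):

public boolean prepareForSave(AxdStack _axdStack, str _dataSourceName)
{
    AxInventJournalTrans axInventJournalTrans;
    AxInventJournalTable axInventJournalTable;

    if (_dataSourceName == tableStr(InventJournalTrans))
    {
        axInventJournalTrans = _axdStack.top();
        axInventJournalTable = axInventJournalTrans.axInventJournalTable();

        this.prepareInventJournalTrans(axInventJournalTrans);

        // Only count the line when we are really creating a new record;
        // an update must not inflate the journal's total number of lines
        if (createRecord)
        {
            axInventJournalTable.parmNumOfLines(axInventJournalTable.parmNumOfLines() + 1);
        }
    }

    return true;
}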




Configuration of the Outbound AIF Port

As we have also implemented the "Read" and "Find" methods, we can create an outbound port in order to send out an XML message with the content of an inventory counting journal. We can use this message to send the Inventory Counting Journal to an external application, together with the document hash code that we will need later when we build the incoming XML message.

Below is the configuration of the outbound port:





And the generated outbound XML Message:


Configuration of the Inbound AIF Port:

Below is the configuration of an inbound port that can process incoming XML messages with the Update operation.




And the corresponding XML message with the Update operation:


Details about the update operation and its implications for the document hash concept

Both the outbound and the inbound messages contain the element “_DocumentHash”. When data is updated using AIF, the document hash is required. When data is read, AIF returns the document hash in the “_DocumentHash” field. This field contains a hash of all the RecId and RecVersion values of each record that is returned. In this example the returned records are the InventJournalTable record and the InventJournalTrans lines, but maybe also the InventDim records if we provide them. If the data changes before the AIF response, the hash will be different and the response cannot be processed correctly. Below is a Microsoft link with an explanation of this concept:


Using the Microsoft Dynamics Ax Trace Parser to understand and correctly modify the AIF artifacts

Understanding the AIF classes and where to add or modify code is not an easy task, and there is not a lot of documentation about it. A good source is the document "AIF Stuff" that we can find here:


Another great way to understand the AIF classes is to use the Microsoft Dynamics Ax Trace Parser. We only have to generate a trace while we perform an update operation and study the code to understand what we have to do to implement the operation properly.

The trace can be generated from a job that calls the AifInboundProcessingService in order to process the incoming message with the Update operation manually.
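A minimal sketch of such a job; it runs the four AIF runtime services in sequence so the inbound processing (where our update operation executes) shows up in the trace:

static void processAifMessages(Args _args)
{
    // Pick up incoming messages from the gateway queue
    AifGatewayReceiveService     aifGatewayReceiveService     = new AifGatewayReceiveService();
    // Process inbound messages (this is where the update operation runs)
    AifInboundProcessingService  aifInboundProcessingService  = new AifInboundProcessingService();
    // Process and send outbound messages
    AifOutboundProcessingService aifOutboundProcessingService = new AifOutboundProcessingService();
    AifGatewaySendService        aifGatewaySendService        = new AifGatewaySendService();

    aifGatewayReceiveService.run();
    aifInboundProcessingService.run();
    aifOutboundProcessingService.run();
    aifGatewaySendService.run();
}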


This gives us a trace file that we can open with the Microsoft Dynamics Trace Parser, so we can start examining the executed code.



Possible AIF exceptions that may appear while testing the new Update operation on the Inventory Counting Journal

Lastly, I want to provide a list of possible AIF exceptions that may arise while implementing the Update operation on the Document Service.


Counting journal outbound is not supported.

This message will appear if the corresponding operation is still not implemented in the class AxdCountingJournal (methods findEntityKeyList, findList, getActionList, read, readList, update or updateList) or in the class InventCountingJournalService (methods find, findKeys, read or update).

There is no service with namespace = 'http://schemas.microsoft.com/dynamics/2008/01/services' and external name = 'InventCountingJournalService'.
This message will appear if we have not written the service name correctly in the incoming XML message. A look at the AIFService table will give us the correct service name.

Cannot process duplicate message. Message {A8EF6A9C-E84A-4FCB-8813-0DBCB34BE2FA} has already been processed and is different from the new message.
This message will appear if we provide an incoming message whose MessageId contains a GUID that has already been used. We should always provide a new GUID, or set the EnableIdempotence property of the InventCountingJournalService update operation to “No” if we want to provide repeated values for MessageId.

No default inventory counting journal name is specified in AIF parameters
We should provide a default journal name for the inventory counting journal in the AIF parameters section of the inventory management parameters. This is not to be confused with the default inventory counting journal in the general inventory management parameters.

Inventory dimension Site is mandatory and must consequently be specified.
This exception happens when the service is only implemented for the “create” operation. In the AIF Document Service classes we should ensure that during the update operation the InventJournalTrans gets the right InventDim, as described in this post.

The request failed with the following error:  The document hash does not match the hash in the XML. The document may have been changed since last read
With the “update” operation we have to provide a document hash in the incoming XML message. This hash is calculated from the RecId and RecVersion values of all the records that are sent or received through AIF. We must be sure about which records we are sending back (InventJournalTable, InventJournalTrans and maybe also InventDim).

Cannot create a record in Inventory transactions originator (InventTransOrigin). Reference: Counting, 00031. The record already exists.

This error happens when the service was first implemented only for the “create” operation and a second incoming XML message referencing the same InventJournalTrans record arrives: the system then tries to create the InventTrans for the counted quantity again. The problem is solved with a little code in the AIF document classes, as described in this post.



Ok, that is it! It only remains to wish you all a Merry Christmas with your families and a happy new year! 2015 promises to be pretty amazing: in Germany lots of companies are moving to Microsoft Dynamics, especially retail companies, and the next version of Microsoft Dynamics Ax, with its exciting new HTML interface, will make its appearance.

Ich wünsche euch Frohe Weihnachten!!! (I wish you all a Merry Christmas!!!)


Saturday 25 May 2013

Creating a product master with variants with X++

Hi all, recently I had to do something tricky with the Data Import/Export Framework and now I want to share it :-)

I had to create product masters, add configurations to their product dimensions, and later release these products with their new configurations so they could be used in the released company.

The Data Import/Export Framework so far has an entity to create product masters, but it seems there is no implementation to release product variants with new configurations. So what I did was add a method to the class DMFProductMasterEntityClass and two new fields to the source CSV file, with the configuration name and the configuration description. When properly set up, the new method is executed during the copy from staging to target, using these two new values from the CSV.

And below is an example of the code that adds a configuration to a product master and releases both the product master and the configuration.

EcoResDistinctProductVariant        ecoResDistinctProductVariant;
EcoResProductVariantDimensionValue  ecoResProductVariantDimensionValue;
RefRecId                            ecoResDistinctProductVariantRecId;
EcoResProductReleaseManagerBase     releaseManager;
container                           productDimensions;
EcoResDisplayProductNumber          displayProductNumber;
EcoResProductNumber                 ecoResProductNumber = "MyProductNumber";
EcoResProduct                       ecoResProduct = EcoResProduct::findByProductNumber(ecoResProductNumber);

// Add the configuration value to the configuration dimension of the product master
EcoResProductMasterManager::addProductDimensionValue(
    ecoResProduct.RecId,
    EcoResProductDimensionAttribute::inventDimFieldId2DimensionAttributeRecId(fieldNum(InventDim, ConfigId)),
    'Variation1',
    'Variation Description',
    'Variation Description');

productDimensions = EcoResProductVariantDimValue::getDimensionValuesContainer("Variation1");

// Build the variant product number from the product number and the dimensions
displayProductNumber = EcoResProductNumberBuilderVariant::buildFromProductNumberAndDimensions(
    ecoResProductNumber,
    productDimensions);

// Create the product variant with the product and dimensions provided,
// unless it already exists
ecoResDistinctProductVariant = EcoResProductVariantManager::findDistinctProductVariant(
    ecoResProduct.RecId,
    productDimensions);

if (!ecoResDistinctProductVariant)
{
    ecoResDistinctProductVariantRecId = EcoResProductVariantManager::createProductVariant(
        ecoResProduct.RecId,
        displayProductNumber,
        productDimensions);
}
else
{
    ecoResDistinctProductVariantRecId = ecoResDistinctProductVariant.RecId;
}

// Fetch the newly created product variant
ecoResDistinctProductVariant = EcoResDistinctProductVariant::find(ecoResDistinctProductVariantRecId);

// Now release the product
if (!ecoResProduct.isReleased())
{
    EcoResProductReleaseManagerBase::releaseProduct(
        ecoResProduct.RecId,
        CompanyInfo::findDataArea(curext()).RecId);

    info("Product released");
}

// Now release the product variant
if (ecoResDistinctProductVariant && !ecoResDistinctProductVariant.isReleased())
{
    releaseManager = EcoResProductReleaseManagerBase::newFromProduct(ecoResDistinctProductVariant);
    releaseManager.release();

    info("Product variant released");
}

I have to say that the customer delivered a CSV file with one record per configuration, while the rest of the fields (product number, product search name, prices, and so on) were repeated; only the configuration name and the configuration description were new. But that was not a problem: for each record the Data Import/Export Framework would create the product master (or update it if it already exists) and also add the new configuration information to it. I only had to modify the index of the DMFProductMasterEntity table so I could load repeated product master numbers with different configurations as staging data...

Ok, that is it for now... over the next days I have a lot to do with the Data Import/Export Framework... maybe I will find something else interesting to write about... and now I finish just in time to watch the great football match of the year here in Germany: the Champions League final between Dortmund and Bayern. Sadly the Spanish teams were seriously beaten by the Germans :-(

Sunday 28 April 2013

AIF - Add operation update to LedgerGeneralJournal standard Document Service

Hi all, after two weeks of vacation in Spain and a one-week course in Prague (Dynamics Ax Performance Workshop) I really had no time to write something new. So here we are.

Some months ago I used the standard AIF Document Service for the ledger journal in order to create new journals through AIF. Everything worked perfectly, but last week I needed to improve this so I could add lines through AIF to a journal that already exists.

Everything looked ok, but I realized that the "update" operation was not activated by default, and making it available has some tricky aspects that I want to share here.

First. We have to update the LedgerGeneralJournalService using the "Update document service" tool, adding the methods "update" and "findKeys" and selecting the regeneration of the supporting classes.

 
Second. It is also recommended to modify the query AxdLedgerGeneralJournal, setting the Update property of the data source LedgerJournalTable to "Yes", so we allow updates on the data source provided by the query.
 
Third. It is possible that we need to update the method "updateList" of the class AxdLedgerGeneralJournal. By default the update is not implemented, so we would receive an "Operation not implemented" message; replacing the error with the code below allows the update.
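A sketch of the fix with an assumed signature (take the real parameter list from the stub that the wizard generated; only the body needs to change):

public AifResult updateList(
    AifDocumentXml              _xml,
    AifEndpointActionPolicyInfo _endpointActionPolicyInfo,
    AifConstraintListCollection _constraintListCollection)
{
    // Delegate to the base class implementation instead of throwing
    // "Operation not implemented"
    return super(_xml, _endpointActionPolicyInfo, _constraintListCollection);
}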
 
 
 
Fourth. One of the tricky things about the XML message that we need to build for updating through AIF is that we need to provide the DocumentHash of the document that we want to update. We could get this value by performing a read operation with AIF, but in my case I wanted to get the DocumentHash without having to call the read operation. This link explains how we can do that:
 

 
And last, we create the XML message in the appropriate format for the Document Service, where we specify the EntityKey which will be used to look up the journal (JournalNum) we want to update and the DocumentHash of the document, and where we state that we want to update the entity LedgerJournalTable but create the LedgerJournalTrans entities (we want to add lines to an existing journal).
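A rough outline of how such a message can look. The envelope and EntityKey namespaces follow the usual AIF scheme, but the exact element layout, the hash placement and the values here are illustrative placeholders rather than a verified message:

<?xml version="1.0" encoding="utf-8"?>
<Envelope xmlns="http://schemas.microsoft.com/dynamics/2008/01/documents/Message">
  <Header>
    <MessageId>{a new, unique GUID}</MessageId>
    <Action>http://schemas.microsoft.com/dynamics/2008/01/services/LedgerGeneralJournalService/update</Action>
  </Header>
  <Body>
    <MessageParts>
      <EntityKeyList xmlns="http://schemas.microsoft.com/dynamics/2006/02/documents/EntityKeyList">
        <EntityKey xmlns="http://schemas.microsoft.com/dynamics/2006/02/documents/EntityKey">
          <KeyData>
            <KeyField>
              <Field>JournalNum</Field>
              <Value>00042</Value>
            </KeyField>
          </KeyData>
        </EntityKey>
      </EntityKeyList>
      <LedgerGeneralJournal xmlns="http://schemas.microsoft.com/dynamics/2008/01/documents/LedgerGeneralJournal">
        <LedgerJournalTable class="entity" action="update">
          <_DocumentHash>the hash returned by the read operation</_DocumentHash>
          <JournalNum>00042</JournalNum>
          <LedgerJournalTrans class="entity" action="create">
            <!-- the values of the new journal line go here -->
          </LedgerJournalTrans>
        </LedgerJournalTable>
      </LedgerGeneralJournal>
    </MessageParts>
  </Body>
</Envelope>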
 
 
These are the steps required to add lines to an existing journal through AIF; something relatively easy, but with small things that can make you lose a lot of time until you get it working. More info about updating data with AIF can be found here:
 
 
As always, I finish with a wonderful photo. This time from the lovely city of Prague, where I attended the Dynamics Ax Performance Workshop :-)
 
 

Sunday 10 March 2013

Microsoft at CeBIT 2013 (Hannover)

Hi again! Last Saturday I visited the CeBIT trade show in Hannover (Germany) for the first time, and I want to give my short impressions about what I saw there.

Our company MODUS Consult AG was there as a Microsoft Partner, and I hope you were there to look into our services with Microsoft Dynamics Ax, Dynamics Nav, QlikView BI or ELO document management software, among others.

Lots of German companies are using or moving to the Microsoft Dynamics ERP products (Nav/Ax). In fact, Dynamics Ax 2012 is finally considered a real alternative to SAP by the bigger German companies, and not only by the mid-sized ones.

From my experience, German companies are very cautious with their ERP selection, so this demonstrates the confidence of the market in the Microsoft ERP products.


But in this post I am not going to talk too much about Microsoft Dynamics Ax itself, but about the surrounding products that Microsoft has recently delivered or is going to deliver: Windows 8, Windows Phone 8, Microsoft Office 365, Microsoft SharePoint 2013 and Windows Azure.

Much has been written about the decadence of Microsoft in the Internet/mobile/cloud era, but Microsoft has reacted quite well to the wave that was coming. Microsoft has reacted, and now it is time for us as professionals to react as well and use the technology that Microsoft puts in our hands.

Windows 8

From my experience as an IT manager in Spain, where I had to make the most with the least money possible, I am not what you would call a technology geek. I like to use something that works for as long as possible, so I can save my company money and time. So I was not really convinced about Windows 8, let alone the Windows RT version.

Using the new Windows 8 on the Microsoft Surface Pro tablet is a really good experience, and the keyboard is simply cool. I could imagine myself working on the ICE train to Berlin with this tablet. I currently own a Galaxy S tablet, and the Microsoft Surface could be used both for work and pleasure, not only the latter, as I do with my current tablet.

The Windows RT tablet is another story. It is cheaper, but the operating system is quite limited: you can only install applications that come from the Windows Store and not normal Windows applications. That could be a limitation. But in one of the sessions I saw a really good use for this tablet.

The fact that you cannot install arbitrary applications can make this device more secure, and with the use of VDI (Microsoft Virtual Desktop Infrastructure) companies can provide a secure way to give mobility to their workers. Users can use their normal applications on the move, but as nothing is installed on the device, security is less compromised. This could give a second life to a device that seemed to be dying before reaching the market.

Windows Phone 8

I have also been a mobile developer since the J2ME era; I have developed some applications for generic J2ME devices and BlackBerry devices, and lately some sites with jQuery Mobile, so I like to keep up to date with the latest mobile technologies. I also have a Galaxy S Android device and I was not very convinced about Windows Phone 8 either before playing with it. I had the opportunity to try the latest models at the trade show and I was finally convinced: it is a cool operating system, and as I like to develop with .NET I found it a really good platform. BlackBerry's new Z10 was also on show, but it did not attract too much attention, so the mobile market remains between the three players Apple, Google and Microsoft. Along the way, Microsoft has given a second opportunity to Nokia.

Microsoft Office 365

We are heading into the Cloud era, and Office 365 is one of Microsoft's steps in this area. Sooner or later almost everything will be in the Cloud, so we had better start getting used to it. It is very cool to have your office documents available to work on from whatever device you have and from whatever place. One of the demonstrations I could take part in was a video conference between two tablets using Lync, sharing and working on a SharePoint presentation.

You can try it yourself at the Microsoft Office website.

If you are still not prepared for the cloud, Microsoft has released Microsoft Office 2013 as an installable product as well.

Windows Azure

I had the opportunity to attend an MSDN session about developing mobile applications with Windows Azure, the cloud solution from Microsoft, and I found it really cool how easily you can create a new mobile application in the cloud, set up its database and start coding in Visual Studio to deliver functionality to your users. We are heading into the cloud world, and probably even future versions of Dynamics Ax will be cloud based, so we had better get used to the technology. Microsoft lets you try the technology for free and start coding in the cloud. Here you can see an example video of how to create mobile applications with Windows Azure.

Developing Mobile Solutions with Windows Azure Part I (MSDN Channel9 video)

Definitely we are heading to the Cloud, but for critical software such as an ERP we still have questions to solve, like where the data is stored, security, and how I can recover my data if the provider goes bankrupt, etc.

Microsoft Sharepoint 2013

And lastly, something about SharePoint 2013. It also looks great. Microsoft has continuously improved its SharePoint products. I have worked with the 2007 and 2010 versions and saw an improvement between them, and this time Microsoft has again delivered a really well-crafted product. New features include a more social intranet, a better workflow module and better mobility with really touch-oriented menus.


As we have seen, Microsoft has reacted energetically on all fronts with new products. I have not even mentioned SQL Server 2012 and Windows Server 2012, systems we are already working with in our latest Dynamics Ax 2012 R2 solutions. So there is a wave of new technology available, and now we should start working and applying this technology in our companies or at our customers, to help them move to the new IT era that is now here, where information and also processes are not just on your computer but everywhere. It is indeed a very interesting time to be an IT professional.


That was all for today! For next time I have something really interesting in mind. For some time I have wanted to develop mobile solutions for Dynamics Ax using jQuery Mobile, so what I will do is create a small project using Microsoft Dynamics Ax 2012, ASP.NET MVC 4 and jQuery Mobile. This will take some time, so I will create a CodePlex project so everyone can take a look, and I will write several posts as the project advances, so you will be able to see how this solution evolves from concept to implementation :-)

And last, a photo I also took at CeBIT of one of the reasons one can have to move to Germany. Where else can you drive a BMW at 200 km/h (124 mph) on the highway? :-)

And believe me, it is not dangerous and quite pleasant!




Saturday 2 March 2013

DMF - Create a custom entity for importing ZipCodes

Data Migration Framework. Create a custom entity for importing ZipCodes

Hi again! This is the second post on my new blog about Dynamics Ax, and as promised I will deliver something interesting. We will talk about the Data Migration Framework and, in particular, how we can import zip codes and their related information (states, counties and cities) by creating a custom entity for that purpose.

But first of all, let's talk a bit about Germany. I moved here, so I think some of you may find my life and experiences here interesting.

This is my first winter in Germany, and although it was not as cold as I expected, it was in fact the darkest winter in 43 years!

[Der Spiegel - Dark winter article]

Basically I hadn't seen the sun in 5 months. But today it was pretty sunny! That I have to say.

Anyway, spring is coming and I can now say that I survived the German winter. This is quite a thing for someone who comes from southern Spain :-) You only have to look at the photo at the end of this post to know what I am talking about :-)

So let's speak about the Data Migration Framework.

As you may know, the Data Migration Framework was released for Microsoft Dynamics Ax 2012 (currently in version Beta 2.0, and also working with SQL Server 2012 and Ax 2012 R2) as a tool to help with the migration of data into Microsoft Dynamics Ax.


After looking at how it works, our company decided to use it in order to standardize our migration procedures and deliver better and more predictable results in our data migration phases.

Basically, what the framework does is take data from a data source (for example a CSV file) and copy it into a staging table, before the data is migrated to the final target table. We can define a mapping from source to staging and also a mapping between staging and target. Between staging and target we can also use functions that are executed to convert the values, or even run some needed logic, before the value is copied to the target.


You can download the framework from the InformationSource website, where you will also find some documentation and even an illustrative video about how to use the framework.

Installing the framework should not be problematic if you follow the instructions in the documentation, but with Beta 2.0 and Dynamics Ax 2012 R2 there is a small issue: one of the classes of the framework tries to use a table named CreditCardCustNumber that no longer exists in this version of Ax.

In my case I just commented out the code that uses this table, because it was not relevant for me, but you will have to check this if you run into the problem.

After installing the framework you will be able to start using it with some of the entities that are already delivered, like customers, vendors, customer addresses, vendor addresses, open sales orders... You can find the entire list in the documentation or in the Target entities form in the Data Migration Framework area page.



In our case we want to import zip codes and we don't have an entity for that, so we will have to create our own entity that creates records in the table LogisticsAddressZipCode, but also in LogisticsAddressState, LogisticsAddressCounty and LogisticsAddresssCity when it has to.

To do that we will use the provided wizard to create the needed AOT objects. So we use the option Create a custom entity for migration, which launches the wizard.



We are asked for the table into which we want to import data. We select LogisticsAddressZipCode.



On the next step we are presented with the names of the entity table, query and class objects that will be created. We could change them, but we leave them as proposed.



The result is a project with all the objects created. But we still have some work to do, because we also want to create records in the tables LogisticsAddressState, LogisticsAddressCounty and LogisticsAddresssCity. For that we will create methods that are executed when the system takes the values of State, County and City from the staging table to the target table. These methods will check whether the values of State, County or City exist; if not, a record will be created in the corresponding table.



In each method we will check if the corresponding record exists and, if not, we will create it. At the end we will return a container with the values that we want to be available for the target table.
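A minimal sketch of one of these methods, assuming the usual DMF entity class pattern where the generated class keeps the staging buffer in an entity member variable; the staging field names used here are assumptions:

public container generateState()
{
    LogisticsAddressState state;

    // Look for the state within the country/region of the staging record
    select firstOnly state
        where state.CountryRegionId == entity.CountryRegionId
           && state.StateId         == entity.State;

    // Create the state on the fly if it does not exist yet
    if (!state)
    {
        state.CountryRegionId = entity.CountryRegionId;
        state.StateId         = entity.State;
        state.Name            = entity.State;
        state.insert();
    }

    // Return the value(s) for the target field(s) mapped to this function
    return [state.StateId];
}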



We also have to modify the method getReturnFields of our entity class, so we indicate for each method which field of the target table will be initialized with the result of the execution of the function.
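A sketch of what this can look like, following the pattern of the standard DMF entity classes (the entity class name and the mapped target fields here are assumptions):

public static container getReturnFields(Name _entity, MethodName _name)
{
    DataSourceName dataSourceName = 'LogisticsAddressZipCode';
    container      con            = [dataSourceName];

    // For each conversion function, declare which target fields receive
    // the values from the returned container, in the same order
    switch (_name)
    {
        case methodStr(DMFLogisticsAddressZipCodeEntityClass, generateState):
            con += [fieldStr(LogisticsAddressZipCode, State)];
            break;
        case methodStr(DMFLogisticsAddressZipCodeEntityClass, generateCounty):
            con += [fieldStr(LogisticsAddressZipCode, County)];
            break;
        case methodStr(DMFLogisticsAddressZipCodeEntityClass, generateCity):
            con += [fieldStr(LogisticsAddressZipCode, City)];
            break;
    }

    return con;
}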



We also need to modify the entity table and create a field group for each function we have created, with the same name as the function. Otherwise we will not be able to use our functions in the modify entity mapping form later.



In the Target entities form we can now set up our new target entity.



And we can modify the field mapping between the staging table and the target table by clicking the Modify target mapping button. Below is the Mapping details view, where we can specify the details of the mapping.



And below is the Mapping visualization. There we can see that our new methods are used between staging and target. Looks cool, eh?



Now we only have to create a processing group which uses our new target entity for zip codes, and map a source file like the one below to the staging table. Then we will be able to import the data, first into the staging table and later to the target table.
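For illustration, a hypothetical source file could look like the following; the column names and the separator are assumptions and just need to match your source-to-staging mapping:

ZIPCODE;CITY;COUNTY;STATE;COUNTRYREGIONID
48143;Muenster;Muenster;NW;DEU
33602;Bielefeld;Bielefeld;NW;DEU
10115;Berlin;Berlin;BE;DEU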




So this is an example of what is possible with the Data Migration Framework, which is still in version Beta 2.0 but already seems really useful for building quite a professional data migration procedure for our Ax projects.

To end this post I give you all some sun from Spain: a photo taken in Aguilas (Murcia), the place where I used to spend my free time before deciding to come to Germany. Nice, isn't it? :-)




Next week the biggest IT trade show in Europe, CeBIT, takes place here in Germany. I hope I have time to drive there, and maybe I can deliver another post about my visit ;-)

So see you all until next time!