Monday, 25 June 2007

Reporting Services on Dynamics NAV

 
When using Reporting Services with Dynamics NAV there are a number of things to take into consideration.
 
One of the main things to consider is the way the SQL tables are structured for Dynamics NAV. For each company in the database there is a set of company-specific tables. These are prefixed with the company name followed by '$'. So, for example, the Customer table for a company called CRONUS International Ltd. is CRONUS International Ltd_$Customer. Note that the '.' at the end of the company name has been replaced with an '_'.
This is based on a setting in NAV that defines which special characters are replaced, and is done to make life simpler in SQL.
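As a rough sketch of that naming scheme, the substitution can be mimicked like this (the exact set of characters NAV replaces is configurable, so the character list below is an assumption for illustration only):

```python
def nav_table_name(company: str, table: str) -> str:
    """Build the SQL table name NAV generates for a company-specific table.

    NAV replaces certain special characters in the company name with '_';
    the set is configurable in NAV, so the characters below are only an
    illustrative guess. The company name is then joined to the table name
    with a '$'.
    """
    for ch in '."\\/\'':
        company = company.replace(ch, "_")
    return f"{company}${table}"
```

For example, `nav_table_name("CRONUS International Ltd.", "Customer")` gives the `CRONUS International Ltd_$Customer` name mentioned above.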
 
This means that when writing reports you need to know the name of the NAV company you want to report on. Where there is only one company in the database this isn't too painful. However, as a developer it means that every time you want to re-use a report in a different company you need to go back in and change the company name part of each table. It also means that, assuming you are going to deploy your reports to a test environment prior to a live deployment, the NAV company in your test system has to have the same name as in your live one, a practice that can cause other confusion when using the NAV client.
 
To get round this problem you can do one of two things.
  1. You can generate a SQL view that presents the data you want to report on and leaves the company name out. This at least makes the reports cross-company compatible, as you only have to build the views once for each company.
  2. If you are basing your reports on data prepared by stored procedures, you also have the option of using dynamic SQL and passing the company name as a parameter at run time.
Both of these options are also fine where you have multiple companies in the database, as you can add an identifier to the SQL view and use a parameter in the report to filter for the correct data.
However, when you have multiple companies and the prospect of more being added and others being closed out on a regular basis (as on a project I am currently working on), the task of keeping the SQL views up to date can force you to use the second option only, i.e. use dynamic SQL and pass the company name as a parameter.
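A minimal sketch of that second option, written in Python rather than T-SQL for brevity: in practice the stored procedure would assemble a statement like this and run it with sp_executesql, with the company name arriving as a report parameter (the replaced-character set is again an assumption, as it is configurable in NAV):

```python
def build_company_query(company: str, table: str, columns: str = "*") -> str:
    """Compose the SELECT statement a stored procedure would build
    dynamically from a company-name parameter.

    This Python version only illustrates the string manipulation involved;
    the real thing would be T-SQL executed via sp_executesql.
    """
    # Apply NAV's special-character substitution (assumed set; it is
    # configurable in NAV).
    safe_company = company
    for ch in '."\\/\'':
        safe_company = safe_company.replace(ch, "_")
    # Bracket the identifier because company names contain spaces.
    return f"SELECT {columns} FROM [{safe_company}${table}]"
```

Called with the CRONUS example company and the Customer table, this produces `SELECT * FROM [CRONUS International Ltd_$Customer]`, which is exactly the per-company table name the report would otherwise have hard-coded.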
 
Another thing to consider is that not every field you can see in the NAV client is actually held in the underlying SQL tables. NAV FlowFields are one such example. From experience, this can make what appears to be a simple report very complicated.
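A FlowField such as a customer's Balance is calculated on the fly by the NAV client, typically by summing ledger entries, so a report query has to reproduce that calculation itself. A minimal sketch of the idea (the table layout and field names here are purely illustrative, not the real NAV schema):

```python
# Hypothetical ledger rows, standing in for a NAV customer ledger table.
ledger_entries = [
    {"customer_no": "C0001", "amount": 1200.0},
    {"customer_no": "C0001", "amount": -200.0},
    {"customer_no": "C0002", "amount": 500.0},
]

def customer_balance(entries, customer_no):
    """Recreate a 'Balance' FlowField by summing the ledger amounts for the
    customer, because the figure is never stored on the Customer table."""
    return sum(e["amount"] for e in entries if e["customer_no"] == customer_no)
```

In a real report this would be a SUM with a GROUP BY (or a join to a subquery) against the ledger table, which is where the apparently simple report starts to grow.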

Reporting Services Overview

Reporting Services is a part of Microsoft SQL Server. It first appeared as an optional download for SQL Server 2000 and is a standard option in SQL Server 2005.

As its name implies, it is a reporting tool. More specifically, it's a web-based reporting tool built on SQL Server, but one that allows reporting on any data source you can create an ODBC connection to.

Predominantly I use SQL databases as the data source, although I have also built reports using Analysis Services as the data source.

The principle is very straightforward. Using Visual Studio 2005 you create a new report and link it to a data source, either defined specifically in the report or by using an external shared data source.

You can then write SQL queries against the data in a series of data sets. You can also set a data set to execute stored procedures on the SQL database. This allows for very complex queries to be written.

Within the report you have access to a number of different controls, including graphing and pivot table style controls.

You can specify that some controls are dependent on other controls and only appear when the parent is selected.

Another useful tool is the principle of 'drill-through' reports. These are effectively extra reports that are called when part of a report is selected. All appropriate parameters are passed down to the new report. This allows you to write summary reports, with the drill-through providing the detail when needed.

Another feature of Reporting Services is the ability to subscribe to reports. This allows a user to pick a report from the report server and specify when they would like to automatically receive it. They can also specify the format the report should be delivered in, e.g. PDF, Excel spreadsheet, TIFF etc. They can set what parameters the report should use and where it should be delivered to. This allows a sales manager, for example, to have the previous day's sales results delivered at 8am to their email as an Excel spreadsheet.


One final tool that comes with some of the versions of Reporting Services is Report Builder. This is a web-deployed report designer. It requires that a data model has been built in advance, but once that has been done an end user can use Report Builder to build ad hoc reports on the data source. These could be one-off reports, or if the user decides that a report will be useful to other users they can deploy it back to the report server, where it becomes available to all users.

The report library itself is either accessed via its own web site or it can be linked to Microsoft SharePoint. Where it is linked, the reports appear in their own library inside the SharePoint portal and can be used in the same way as from the web site.

Saturday, 23 June 2007

Jobs

Under NAV 5.0 the Jobs module has been fairly heavily rebuilt.

I'll summarise here and try to expand in the future.


For a long time it has been felt that the jobs module in NAV needed to be overhauled. In NAV 5.0 Microsoft has done just that.

Gone are the old concepts of Phases, Tasks and Steps.

Gone are Job budgets.

Now we have Job Task Lines and Job Planning Lines.

Each job has Job Task Lines. These are built up in a similar way to G/L accounts: lines are set to be Heading, Posting, Begin-Total, End-Total or Totalling, just as with G/L accounts.

The structure also mirrors that of MS Project, and while there is no integration out of the box right now, it's something I think could be done fairly easily. When I get a bit of time I'll give it a go.

You can also specify different Job Posting Groups against each Task Line which allows for better division of WIP and Job Recognition.

Each Task Line has associated Planning Lines. Planning Lines are similar to the old job budgets; however, they do slightly more. Each Planning Line can be defined as Schedule, Contract, or both. At the simplest level, Schedule is the expected costs and profits, just like the old budgets. Contract defines the amount you are allowed to charge on to the customer of the job.

Whenever you post an entry to a Job you have to select whether it is Contract, Schedule, both or neither. You also have to select the Job Task that the entry is against.

Once you start posting Job entries you can create sales invoices. Again, this is much easier than under version 4 and earlier. Now all you have to do is go to the Job Planning Lines, or Job Task Lines, and use the 'Create Sales Invoice' function.

Other enhancements to Jobs include more control around how WIP is calculated and posted, including a new Tab on the Job card that gives a breakdown of the current WIP and Recognition.

Friday, 22 June 2007

Navision Rapid Implementation Methodology

So the RIM toolkit for NAV has been around for a while now, but I have just started a project that will use it with version 5.0.

With NAV 5.0 when you create a new company you get prompted to say what type of company it is. The list of 'out of the box' configurations is fairly limited with options for Food, Furniture, High Tech, Machinery and Wholesale.

Selecting one of these automatically sets up master data, which is all configured in XML files hidden away in the client install folder.

That said, it's fairly straightforward to create new configurations.

  1. Create a new XML file for the configuration you want.

  2. Modify the Configuration.xml file to reference the new file.

Once you have your nice new company all set up and the configurations from the XML file have all been loaded, it's time to move onto the specific data for this company.

Again the RIM tool makes this easy, and if you build your XML configuration file correctly you will even have the RIM templates set up and ready to go. If the templates aren't set up for you, you will need to create them, but this is also straightforward.

The setup templates fall into two categories: Setup Configuration (handled by the Setup Questionnaire) and Data Migration (handled by the Data Migration tool).

The setup questionnaires take a little bit of thought initially. You can have multiple setup questionnaires, each with multiple setup tables associated with them. Looking at it, I think you should always try to use only one questionnaire, or ensure that each setup table is only referred to once, or you could end up with some confusion.

Once you have your questionnaire you need to set up the actual questions. These belong to 'Question Areas', which are the setup tables. Give your Question Area a code and description and then associate it with a table. You can then use the Update Questions function. This will populate the question lines with all the fields from the table. You can then go through modifying the questions for each line so that they make a bit of sense to a user. You can add comments or define initial answers. After you have completed a Question Area, move onto the next one.

Once all your Question Areas are designed, you go back to the main questionnaire screen and use either Export to XML or Export to Excel. So far I have only used the Excel option as I intend to give the file to the user to complete. When the Excel file is filled in you can re-import it and the answers will be loaded into the Question Area screens, where you can review them and then apply them, which updates the setup tables... Easy huh?

For the Data Migration the process is similar. This time there is no Questionnaire to worry about. You just define the table that you want to migrate data to and then select Migration Fields from the Migration button menu. In here you define the fields that will be imported. When you open the screen you will get a list of all the fields and you just need to flag those that will be included.

Once you have this set up you can select the various pieces of master data to create files for and use the Export to XML or Export to Excel functions. Again I am using the Excel option. If you select multiple tables you get a separate tab in the resulting file for each table. If any of the tables already have data in them this is included in the exported file, which is useful for data that was set up from the config XML file earlier, where you want the customer to review the defaults and amend them.

This is given to the users and once filled in you can import the data to NAV.


The final part of the Toolkit allows for Templates to be built. These again refer to specific tables.
You can define multiple Templates for each table if required. In the Template you can define which fields are Mandatory and what default values they should have. The Templates can be used as part of the Data Migration to complete data the user may not know about, but they can also be used after the migration.

As an example, we have a number of templates for the Customer table. Each defines slightly different posting groups and payment terms. From the template you can create an instance; this will create a new record in the Customer table and pre-fill all the information from the template. Alternatively, from the customer you can run the Apply Template function. This gives you a list of templates for that table, and when you select one it updates the current record with the defaults from the template.
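The fill-in-defaults behaviour can be sketched like this (the field names are purely illustrative, and the assumption that existing values are kept rather than overwritten is mine, for illustration):

```python
# Hypothetical customer template: field names and values are made up.
customer_template = {"Posting Group": "DOMESTIC", "Payment Terms": "30 DAYS"}

def apply_template(record, template):
    """Fill in template defaults for any fields the record doesn't yet
    have, leaving fields the user already completed untouched (one
    plausible reading of NAV's Apply Template behaviour)."""
    result = dict(record)
    for field, default in template.items():
        result.setdefault(field, default)
    return result
```

So a customer keyed in with only a name would pick up the template's posting group and payment terms, while any value the user had already entered would survive.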

With a little bit of work the templates could be made more useful still, especially in environments where new master data, e.g. customers and items, is regularly set up, as it would reduce the amount of keying of default data while still allowing flexibility.