Thursday, 20 December 2007

Microsoft CRM 4.0 "Titan" Released

 

REDMOND, Wash. — Dec. 17, 2007 —The new version of Microsoft Dynamics CRM, formerly code-named “Titan,” has been completed and released to manufacturing, Microsoft Corp. announced today. The new version is offered under two product names: Microsoft Dynamics CRM 4.0 for on-premise and partner-hosted deployments and Microsoft Dynamics CRM Live for Microsoft-hosted deployment. Designed with a single unified-code base for both on-premise and on-demand deployments, Microsoft Dynamics CRM enables customers to choose the right deployment model for their specific business and IT needs, with the flexibility to change deployment models over time if their needs or preferences change.

 

This is good news for me as I am leading a project that is using CRM4.0 as a major component.

 

Read the full release here.

Wednesday, 24 October 2007

NAV goes WSS3.0

 

Finally Microsoft has released an update for NAV that allows it to use WSS3.0 for Employee Portal instead of WSS2.0. This has been much requested and looked forward to, especially in our office where we are doing more and more work involving WSS3 and MOSS.

If you are a MBS partner you can download the update here https://mbs.microsoft.com/partnersource/downloads/releases/EmployeePortalNAV50.htm

Tuesday, 11 September 2007

Reporting Services, Hierarchies and Cross-Browser Problems

This week I have been doing some Report development for a custom application we have been developing at work. Given everything is SQL based we opted to build the reports using Reporting Services.

On the whole the reports are pretty straightforward. However, there were two challenges.

The first is around hierarchical reporting. Within the application, people can belong to different Occupational Units (OUs). One of the business rules for the reporting is 'A user can only report on their own OU and any OU that is below them.'

To achieve this I ended up building a recursive table-valued function that works out all the OUs a specific user can see. I then use that in my WHERE clause.

The table function looks like this:

CREATE FUNCTION [Generate_OUTree]
(
-- The ID of the OU the user belongs to
@OUID [uniqueidentifier]
)
RETURNS @OUTree TABLE
(
OUID [uniqueidentifier],
ParentID [uniqueidentifier],
OUName Varchar(500),
Level Int
)
AS
BEGIN
-- Build the OU tree with a recursive CTE
With OUTree (OUID, ParentID, OUName, Level)
As
(
-- Anchor member definition to get the user's own OU
Select OUID, ParentOUID, OUName, 0 as Level
From CustomerOUView
Where OUID = @OUID
UNION ALL
-- Recursive member definition to get child OU's
Select CCV.OUID, CCV.ParentOUID, CCV.OUName, OUTree.Level + 1
From CustomerOUView CCV
Inner Join OUTree
On CCV.ParentOUID = OUTree.OUID
)
Insert Into @OUTree
Select * From OUTree

RETURN
END

As you can see, I pass in a parameter that is the ID of the user's OU.

To actually use this to achieve what I want, I call the function like this:

Select *
From dbo.staffView
Where OUID IN (Select OUID From dbo.Generate_OUTree(@OUID))

This effectively returns all the rows from my view where the OUID in the view is in the list generated by my function.

The follow-on to this was that we had to build a full OU tree report. The report shows each OU under its parent and indents the list so it is easy to see which child OU belongs to which parent.

The hard part was sorting the report so that the OU structure came back in a format that made it easy to group the OUs together.

CREATE PROCEDURE [Report_GenerateOUList]
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;

With OUTree (OUID, ParentID, OUName, Level, Sort)
As
(
-- Anchor member definition to get the top level OU
Select OUID, ParentOUID, OUName, 0 as Level,
Cast(OUName as Varchar(4000))
From CustomerOUView
Where ParentOUID = '00000000-0000-0000-0000-000000000000' -- top level OU has no parent
UNION ALL
-- Recursive member definition to get child OU's
Select CCV.OUID, CCV.ParentOUID, CCV.OUName,
OUTree.Level + 1, Cast(OUTree.Sort + '|' + CCV.OUName as Varchar(4000))
From CustomerOUView CCV
Inner Join OUTree
On CCV.ParentOUID = OUTree.OUID
)
Select * From OUTree
Order by Sort
END





In the database the OU IDs are all GUIDs, and as I wanted all OUs displayed in this report I have hard-coded the empty GUID for the top-level OU.
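To see why ordering by the concatenated Sort column groups the tree correctly, here is a minimal, self-contained sketch. The OU names and integer keys are made up for illustration; the real table uses GUIDs.

```sql
-- Minimal sketch with made-up data; the real OU keys are GUIDs.
DECLARE @OU TABLE (OUID int, ParentID int, OUName varchar(100));
INSERT INTO @OU VALUES
    (1, 0, 'Head Office'),
    (2, 1, 'Sales'),
    (3, 1, 'Finance'),
    (4, 2, 'Sales EMEA');

WITH OUTree (OUID, Level, Sort) AS
(
    -- Anchor: the top level OU
    SELECT OUID, 0, CAST(OUName AS varchar(4000))
    FROM @OU WHERE ParentID = 0
    UNION ALL
    -- Recursion: append each child's name to its parent's path
    SELECT o.OUID, t.Level + 1, CAST(t.Sort + '|' + o.OUName AS varchar(4000))
    FROM @OU o
    INNER JOIN OUTree t ON o.ParentID = t.OUID
)
SELECT Sort FROM OUTree ORDER BY Sort;
-- Sort comes back as:
--   Head Office
--   Head Office|Finance
--   Head Office|Sales
--   Head Office|Sales|Sales EMEA
```

Because each child's path string starts with its parent's path, ORDER BY Sort always places children directly after their parent.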






Inside Reporting Services I am using the following expression to get the indentation:

=(Space(Fields!Level.Value * 5) + Fields!OUName.Value)

This inserts five spaces in front of the OU's name for each level down the tree.

Once I had all this going and deployed to my Report Server, everything appeared to be working fine when I viewed the reports with IE7. The only remaining thing was to do a second check on the reports using Firefox.

Firefox and Reporting Services have a few issues. Due to the way RS renders the reports in IFRAMEs, Firefox squashes them up so they don't display correctly. After some frantic searching (Google is your friend) and a bit of testing on a couple of report servers, I found the following.

First up, Firefox only shows the top 2 to 3 cm of your report. The rendering puts it in a small IFRAME instead of expanding down your 'page' the way IE does. This is fairly easy to resolve, thanks to Jon Galloway's blog for this answer. You need to edit ReportingServices.css. On my machine it lives here: "C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\ReportManager\Styles\"

In the CSS file you need to add the following entry:

/* Fix report IFRAME height for Firefox */
.DocMapAndReportFrame
{
min-height: 860px;
}

This forces Firefox to use a taller IFRAME.

I also found that the width of the report was getting squashed up on my deployment server, although on my laptop's report server it wasn't happening. On investigating the two machines, the only difference I could find was the build versions of the two SQL Servers: the deployment server is running SQL 2005 SP1 while my laptop is on SQL 2005 SP2.

As a workaround I found that putting an empty text box, the full width of the report, in the header of my reports stops Firefox squashing up the report. Ideally I will update the server with the latest SQL service pack, but this is acting as a workaround for the time being.

Tuesday, 4 September 2007

Dynamics Book

Maybe I have been a bit slow, or maybe I just wasn't paying attention to the right things, but I have just found a useful site that is being put together by the Dynamics community. So far the vast majority of the work appears to have been done by David Singleton, one of the MVPs for Dynamics NAV.

It's called the Dynamics Book and it's a wiki for all the Dynamics products. Understandably, at this stage the majority of the information is around Dynamics NAV, but hopefully with the wider community getting in on the act it will grow and become a good source for all Dynamics products.

This, in conjunction with the forums MiBuSo and Dynamics User, which many Dynamics users and consultants regularly contribute to, is an invaluable resource for anyone working in this space, be you an end user, a new consultant or developer, or an experienced one. There's always a new trick to learn. Check them out.

I'll be adding permanent links to my tool box for all of them.

Saturday, 28 July 2007

Document Approval in Employee Portal

In NAV 5.0 there is now standard document approval functionality.

While it is fairly basic in some respects, it is more than adequate for a smallish company to handle purchase and sales approvals.

A request I had with one of my clients was for the document approvers to check and approve the documents on a website as these approvers would have very little reason to access NAV for any other reason.

I opted to deploy the approvals via SharePoint using Employee Portal.

 

Initially I figured this would be fairly straightforward. All I needed to do was set up a couple of web parts in NAV that displayed the document approvals data, pre-filter it so that only approvals for the logged-in user would be displayed, and provide a link that would display the associated purchase document. I'd also need to add an option box that allowed the user to select Approve, Reject or Delegate.

All pretty simple stuff really. So off I went: I added the option field to a table and set up a list web part for the Document Approvals. I also set up a card part for this table, the logic being I'd display a list of all approvals with minimal information, and then the user would access the card part to get more information and to select the Approve, Reject or Delegate option. In the card part I added a save button.

Now came the tricky bit. I needed to update NAV when the user made a selection.

In standard NAV the only tables that can be updated by Employee Portal are the ones related to the demo web parts, i.e. Customers, Sales Headers and Lines, BOMs and Items. If you want to get anything else back in, you have to make changes to at least one codeunit. Depending on what you are trying to do, you need to work on codeunits 6822 through 6827. For what I wanted to achieve I had to add extra functions to 6824 - Modify Head Data mgt.

What I had to do was add a new function that deciphered the incoming XML, extracted the fields I was interested in and then executed the standard Approve, Reject and Delegate functions.

This took a while because I used a principle I have used many times before when building new code: build it up in small steps, checking I am getting the result I want at each step. With the Employee Portal bits, what I failed to realise was that I had to initialise a whole heap of variables that get passed out of the code I was working on, so that the web part updates after my code executes. As I hadn't added this code, I kept getting error after error in the portal site. Given that every time you make a change you have to stop and start the NAS so that it picks up the new code, and that you can't use the standard debugger to see where it is failing, this became a fairly long process: insert an error command in the code, restart the NAS, reload the approval document, save it and see what happened, i.e. did I hit my deliberate error before hitting the piece of code that was failing. Once I found the line of code that was giving the problem it was straightforward to resolve; finding that line took me about 4 hours.

Once I understood where in my 'normal' style of development I had gone wrong it took me another couple of hours to get the functions I wanted to work. I did run into 2 other problems with it all.

The first was that when I update the record that the portal has fetched, I get a 'Another user has modified this record since it was retrieved from the database' error. I can almost live with that; I am going to attempt to set it up so that the portal resets and doesn't attempt to display the updated record anyway.

The second proved a lot more problematic. After getting the web-based approvals set up for one company, I am in the position where I need to roll it out over several companies for the same client. What I don't want is multiple NASs running, as the client is going to be creating new companies and removing old ones from time to time, so I want to run the approval functionality across multiple companies. After much thought we decided to make a new approval table that is updated when approvals are created and modified. This table also captures the name of the NAV company that the approval was raised in. We then use this to drive the approval portal.

This gave us a lot of headaches, as when something is approved in the portal it needs to update the correct company in NAV, but some clever coding has got round this. There is a fair amount of other multi-company functionality required by this customer, so I'll do a separate post about some of those challenges once I have them figured out.

Monday, 25 June 2007

Reporting Services on Dynamics NAV

 
When using Reporting Services with Dynamics NAV there are a number of things to take into consideration.
 
One of the main things to consider is the way the SQL tables are structured in Dynamics NAV. For each company in the database there is a set of company-specific tables, prefixed with the company name and a '$'. So, for example, the Customer table for a company called CRONUS International Ltd. is CRONUS International Ltd_$Customer. Note the '.' at the end of the company name has been replaced with an '_'.
This is based on a setting in NAV that defines which special characters are replaced, and is done to make life simpler in SQL.
 
This means that when writing reports you need to know the name of the NAV company you want to report on. Where there is only one company in the database this isn't too painful. However, as a developer it means every time you want to re-use a report in a different company, you need to go back in and change the company name part of each table. It also means that, assuming you are going to deploy your reports to a test environment prior to a live deployment, the NAV company in your test system has to be named the same as in your live one, a practice that can cause other confusion when using the NAV client.
 
To get round this problem you can do one of two things:
  1. You can create a SQL view to present the data you want to report on and leave the company name out. This at least makes the reports cross-company compatible, as you only have to build the views once for each company.
  2. If you are basing your reports on data prepared by stored procedures, you also have the option of using dynamic SQL and passing the company name as a parameter at run time.
Both of these options also are fine where you have multiple companies in the database as you can add an identifier to the SQL View and use a Parameter in the report to filter for the correct data.
However, as with a project I am currently working on, when you have multiple companies and the prospect of more being added and some being closed out on a regular basis the prospect of keeping the SQL Views up to date can force you to use the second option only, i.e. use dynamic SQL and pass the company name as a parameter.
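Both options can be sketched in T-SQL. All object, column and company names below are illustrative, not taken from the actual project, and the company name passed to the procedure is assumed to already have NAV's special-character replacement applied:

```sql
-- Option 1 (illustrative names): a view that carries a company identifier,
-- so one report serves every company. The second company is made up
-- purely as an example.
CREATE VIEW dbo.AllCompanyCustomers AS
SELECT 'CRONUS International Ltd.' AS CompanyName, [No_], [Name]
FROM [CRONUS International Ltd_$Customer]
UNION ALL
SELECT 'CRONUS Subsidiary Ltd.', [No_], [Name]
FROM [CRONUS Subsidiary Ltd_$Customer];
GO

-- Option 2 (illustrative names): dynamic SQL that builds the table name
-- from the company name passed in at run time.
CREATE PROCEDURE dbo.Report_CustomerList
    @CompanyName nvarchar(30)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql nvarchar(max);
    -- QUOTENAME brackets the table name, which also guards against
    -- SQL injection through the parameter
    SET @sql = N'SELECT [No_], [Name] FROM '
             + QUOTENAME(@CompanyName + N'$Customer');
    EXEC sp_executesql @sql;
END
```

With option 2, adding or closing a company needs no SQL changes at all; the report just needs the right company name passed as its parameter.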
 
Another thing to consider is that not every field you can see in the NAV client is actually held on the underlying SQL tables. NAV FlowFields are one such example. From experience, this can make what appears to be a simple report very complicated.

Reporting Services Overview

Reporting Services is a part of Microsoft SQL Services. It first appeared as an optional download for SQL2000 and is a standard option for SQL2005.

As its name implies, it is a reporting tool. More specifically, it's a web-based reporting tool built on SQL Server that allows reporting on any data source you can create an ODBC connection to.

Predominantly I use SQL databases as the data source, although I have also built reports using Analysis services as the data source.

The principle is very straightforward. Using Visual Studio 2005 you create a new report and link it to a data source, either defined specifically in the report or by using a shared data source.

You can then write SQL queries against the data in a series of data sets. You can also set a data set to execute stored procedures on the SQL database. This allows for very complex queries to be written.

Within the report you have access to a number of different controls, including graphing and pivot table style controls.

You can specify that some controls are dependent on other controls and only appear when the parent is selected.

Another useful tool is the principle of 'drill through' reports. These are effectively extra reports that are called when part of a report is selected. All appropriate parameters are passed down to the new report. This allows you to write summary reports with the drill through allowing you to have details when needed.

Another feature of Reporting Services is the ability to subscribe to reports. This allows a user to pick a report from the report server and specify when they would like to receive it automatically. They can also specify the format the report should be delivered in, e.g. PDF, Excel spreadsheet, TIFF etc. They can set what parameters the report should use and where it should be delivered to. This allows a sales manager, for example, to have the previous day's sales results delivered at 8am to his email as an Excel spreadsheet.


One final tool that comes with some versions of Reporting Services is Report Builder. This is a web-deployed report designer. It requires that a data model has been built in advance, but once that has been done an end user can use Report Builder to build ad hoc reports on the data source. These could be one-off reports, or, if the user decides a report will be useful to other users, they can deploy it back to the report server where it becomes available to all users.

The report library itself is either accessed via its own web site or it can be linked to Microsoft SharePoint. Where it is linked, the reports appear in their own library inside the SharePoint portal and can be used in the same way as from the web site.

Saturday, 23 June 2007

Jobs

Under NAV 5.0 the Jobs module has been fairly heavily rebuilt.

I'll summarise here and try to expand in the future.


For a long time it has been felt that the jobs module in NAV needed to be overhauled. In NAV 5.0 Microsoft has done just that.

Gone are the old concepts of Phases, Tasks and Steps.

Gone are Job budgets.

Now we have Job Task Lines and Job Planning Lines.

Each job has Job Task Lines. These are built up in a similar way to G/L accounts: lines can be set to Heading, Posting, Begin-Total, End-Total or Totalling.

The structure also mirrors the structure of MS Project, and while there is no integration out of the box right now, it's something I think could be done fairly easily. When I get a bit of time I'll give it a go.

You can also specify different Job Posting Groups against each Task Line which allows for better division of WIP and Job Recognition.

Each Task Line has associated Planning Lines. Planning Lines are similar to the old job budgets; however, they do slightly more. Each Planning Line can be defined as Schedule, Contract or both. At the simplest level, Schedule is the expected costs and profits, just like the old budgets. Contract defines the amount you are allowed to on-charge to the customer of the job.

Whenever you post an entry to a Job you have to select whether it is Contract, Schedule, both or neither. You also have to select the Job Task that the entry is against.

Once you start posting Job entries you can create sales invoices. Again, this is much easier than under version 4 and earlier. Now all you have to do is go to the Job Planning Lines, or Job Task Lines, and use the function 'Create Sales Invoice'.

Other enhancements to Jobs include more control around how WIP is calculated and posted, including a new Tab on the Job card that gives a breakdown of the current WIP and Recognition.

Friday, 22 June 2007

Navision Rapid Implementation Methodology

So the RIM toolkit for NAV has been around for a while now, but I have just started a project that will use it with version 5.0.

With NAV 5.0, when you create a new company you get prompted to say what type of company it is. The list of 'out of the box' configurations is fairly limited, with options for Food, Furniture, High Tech, Machinery and Wholesale.

Selecting one of these automatically sets up master data, which is all configured in XML files hidden away in the client install folder.

That said, it's fairly straightforward to create new configurations.

  1. Create a new XML file for the configuration you want.

  2. Modify the Configuration.xml file to reference the new file.

Once you have your nice new company all set up and the configurations from the XML file have all been loaded, it's time to move onto the specific data for this company.

Again the RIM tool makes this easy, and if you build your XML configuration file correctly you will even have the RIM templates set up and ready to go. If you don't have the templates set up for you, then you will need to create them, but this is also straightforward.

The setup templates fall into two categories: Setup Configuration (handled by the Setup Questionnaire) and Data Migration (handled by the Data Migration).

The setup questionnaires take a little bit of thought initially. You can have multiple setup questionnaires that each have multiple setup tables associated with them. Looking at it, I think you should always try to use only one questionnaire, or ensure that each setup table is only referred to once, or you could end up with some confusion.

Once you have your questionnaire you need to go to the actual questions. These belong to 'Question Areas', which are the setup tables. Give your Question Area a code and description and then associate it with a table. You can then use the function Update Questions. This will populate the question lines with all the fields from the table. You can then go through modifying the questions for each line so that they make a bit of sense to a user. You can add comments or define initial answers. After you have your Question Area completed, move onto the next one.

Once all your Question Areas are designed, you go back to the main questionnaire screen and use either Export to XML or Export to Excel. So far I have only used the Excel option, as I intend to give the file to the user to complete. When the Excel file is filled in you can re-import it, and the answers will be loaded into the Question Area screens where you can review them and then apply them. This will update the setup tables... Easy, huh?

For the Data Migration the process is similar. This time there is no Questionnaire to worry about. You just define the table that you want to migrate data to and then select Migration Fields from the Migration button menu. In here you define the fields that will be imported. When you open the screen you will get a list of all the fields and you just need to flag those that will be included.

Once you have this set up you can select the various pieces of master data to create files for, and use the functions Export to XML or Export to Excel. Again I am using the Excel option. If you select multiple tables you get a separate tab in the resulting file for each table. If any of the tables already have data in them this is included in the exported file, which is useful for data that was set up in the config XML file earlier, where you want the customer to review the defaults and amend them.

This is given to the users and once filled in you can import the data to NAV.


The final part of the toolkit allows for templates to be built. These again refer to specific tables.
You can define multiple templates for each table if required. In a template you can define which fields are mandatory and what default values they should have. The templates can be used as part of the data migration to complete data the user may not know about, but they can also be used after the migration.

As an example, we have a number of templates for the Customer table. Each defines slightly different posting groups and payment terms. From the template you can Create an Instance. This will create a new record in the Customer table and pre-fill all the information from the template. Alternatively, from the customer you can run the function Apply Template. This gives you a list of templates for that table, and when you select one it updates the current record with the defaults from the template.

With a little bit of work the templates could be made more useful still, especially in environments where new master data, e.g. customers and items, is regularly set up, as it would reduce the amount of keying of default data while still allowing flexibility.