2-Controlware 10.03 is available for Dynamics NAV 2016, 2017 and 2018 and for Business Central on-premises Fall ’18 (R13) and Spring ’19 (R14). Versions for NAV 2013 R2 and NAV 2015 are available on request. Dynamics NAV 3.xx, 4.xx, 5.xx, NAV 2009, NAV 2009 R2 and NAV 2013 are no longer supported.
Our 2-Controlware software is in long-term support mode. This means that there will be no additional releases and that only high-priority blocking issues will be fixed, on a per-customer basis.
The 2-Controlware modules require basic read and execute permissions for the objects. Make sure to grant all users full access permissions on the following tables:
Field and Dataset Security
Table ID | Table Name | Permissions |
11112031 | 2C Secured Table Per Source | Read, Insert, Modify, Delete |
11112032 | 2C Secured Field Per Source | Read, Insert, Modify, Delete |
11112033 | 2C Secured Dataset Per Source | Read, Insert, Modify, Delete |
Mandatory Fields
Table ID | Table Name | Permissions |
11112045 | 2C Mandatory Table Per Source | Read, Insert, Modify, Delete |
11112046 | 2C Mandatory Field Per Source | Read, Insert, Modify, Delete |
11112051 | 2C Error Message per User | Read, Insert, Modify, Delete |
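As an illustration (hypothetical values, not shipped setup), you could add these tables as Table Data lines to a permission set that every user holds, for example a general set such as 2C-GEN-ALL. A line for the first table would then look like:

Object Type: Table Data | Object ID: 11112031 | Object Name: 2C Secured Table Per Source | Read: Yes | Insert: Yes | Modify: Yes | Delete: Yes

Repeat this for each of the tables listed above.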
The general setup of 2-Controlware can be done after importing and compiling the objects and when a correct license is active.
In this window you can manage the general setup of 2-Controlware Compliance.
Calculation of Change Log:
The calculation of the change log with 2-Controlware is by default done in our codeunit 2C Trigger Management. Some partners experienced problems with this change log calculation, so we added a function to the Compliance Setup with which you can optionally turn the calculation off and perform the change log calculation in another place (such as codeunit 1).
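As an illustration, the sketch below shows roughly how the change log calls are wired up in codeunit 1 (ApplicationManagement) of Dynamics NAV 2016-2018 when the calculation is performed there instead of in 2C Trigger Management. It is a minimal C/AL sketch, not 2-Controlware code, and assumes a global variable ChangeLogMgt of the standard Codeunit 423 Change Log Management.

OnDatabaseInsert(RecRef : RecordRef)
BEGIN
  // ChangeLogMgt is a global variable of Codeunit 423 "Change Log Management"
  ChangeLogMgt.LogInsertion(RecRef);
END;

OnDatabaseModify(RecRef : RecordRef)
BEGIN
  ChangeLogMgt.LogModification(RecRef);
END;

OnDatabaseDelete(RecRef : RecordRef)
BEGIN
  ChangeLogMgt.LogDeletion(RecRef);
END;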
The following options are available:
Numbering
Table Relation Nos.:
Select the number series that you want to use for the Table Relations. We advise using manual numbers. It is common to use the table numbers in the No. field of the Table Relations, for example Table Relation T38 – T39.
Windows Groups
2-Controlware can use account group membership from Active Directory in the modules Authorization Management and Authorization Monitoring. This information is retrieved from Active Directory on demand, but it can also be retrieved on a regular basis by checking the field Process with Job Scheduler.
The field Last Update only shows the moment of the last complete update.
By entering a value in the field Maximum Result LDAP Query you can limit the number of results returned by an LDAP query. In large environments an LDAP query might return an unexpectedly large number of results, which could cause an overflow in NAV.
If you add a condition to the check moment, you may need to reset the validation field to empty. Simply deleting the value is, depending on the field type, not always enough; sometimes a specific value is required. We have compiled a list of empty values, but you can modify these values if necessary.
On the FastTab Default Empty Values you can modify the initial settings for empty values.
To save the current settings of the modules we have added dataports or XMLports to the menu. These ports can export and import a file containing the settings. The import overwrites the existing records and does not check for consistency, so be careful when you use it. You can find the ports in the menu Departments - Compliance, in the setup section of the relevant modules.
The following dataports and XMLports are available:
11111980: Export/Import Role/Permission
Table 2000000004 Permission Set
Table 2000000005 Permission
11111983: Export/Import Authorization Management Settings
Table 11111979 2C User Profile
Table 11111978 2C User Role per User Profile
Table 11111977 2C User per User Profile
11111987: Export/Import Authorization Actions
Table 11112077 2C Auth. Template Header
Table 11112078 2C Auth. Template Line
Table 11112080 2C Auth. Template Line Value
Table 11112083 2C Auth. Template Dimension
Table 11112084 2C Key Definition Header
Table 11112085 2C Key Definition Line
Table 11112081 2C Batch Action
11111988: Export/Import AM Pro Settings
Table 11112067 2C Authorization Mgt Pro Setup
Table 11112079 2C Org. Dim. Value Line
11111989: Export/Import Organization Structure
Table 11112068 2C Org. Dimension
Table 11112069 2C Org. Dim. Translation
Table 11112070 2C Org. Dim. Combination
Table 11112071 2C Org. Dim. Value Header
Table 11112072 2C Default Org. Dimension
Table 11112079 2C Org. Dim. Value Line
11111984: Export/Import Field and Dataset Security Settings
Table 11112015 2C Secured Table
Table 11112016 2C Secured Field
Table 11112017 2C Role per Secured Field
Table 11112019 2C Secured Dataset
Table 11112018 2C Role per Secured Dataset
Table 11112020 2C User Filter
Table 11112021 2C User Filter Line
11111986: Export/Import Authorization Call Management Settings (XMLport only)
Table 11112044 Authorization Call Setup
11111985: Export/Import Authorization Monitoring Settings
Table 11111990 2C Organization Type
Table 11111982 2C Process
Table 11111983 2C Sub Process
Table 11111984 2C Standard Competence
Table 11111985 2C Allowed Permission
Table 11111986 2C Conflicting Competence
Table 11111987 2C User Profile per Competence
11111979: Export/Import Competences
Table 11111984 2C Standard Competence
Table 11111985 2C Allowed Permission
11111978: Export/Import Conflicts
Table 11111986 2C Conflicting Competence
11111990: Export/Import Mandatory Fields Settings
Table 11112048 2C Mandatory Table
Table 11112049 2C Mandatory Field
11111991: 2C Exp/Imp PE Settings
Table 11112060 2C Permanence Setup
Table 11112061 2C Perm. Templ. Header
Table 11112062 2C Perm. Templ. Alloc. Line
Use the Job Queue of Dynamics NAV to schedule the execution of Codeunit 11111990 2C Job Scheduler. We advise running the job on a weekly schedule.
To activate the Job Scheduler for Compliance, open the window Compliance Setup by selecting menu Departments - Compliance - Setup - Compliance Setup.
Under the tab 'Windows Groups' you can check the Process with Job Scheduler checkbox to use the synchronization options.
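For reference, the job queue entry can also be created in code. The following is a minimal C/AL sketch (assuming the standard table 472 Job Queue Entry and codeunit 453 Job Queue - Enqueue of NAV 2016-2018); creating the entry manually on the Job Queue Entries page works just as well.

// JobQueueEntry : Record 472 "Job Queue Entry"
JobQueueEntry.INIT;
JobQueueEntry."Object Type to Run" := JobQueueEntry."Object Type to Run"::Codeunit;
JobQueueEntry."Object ID to Run" := 11111990;  // 2C Job Scheduler
JobQueueEntry."Recurring Job" := TRUE;
JobQueueEntry."Run on Mondays" := TRUE;        // one day per week = weekly schedule
JobQueueEntry."Starting Time" := 020000T;      // for example at 02:00, outside office hours
CODEUNIT.RUN(CODEUNIT::"Job Queue - Enqueue",JobQueueEntry);  // inserts the entry and sets it to Ready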
Default Microsoft Dynamics NAV authorizations are set up by defining permission sets (formerly user roles). Permissions for one or more objects can be assigned to each role, and a role can be assigned to one or more users. This way users can be authorized with permissions to objects. In many cases authorizations are built and set up at work-instruction or organization-profile level. This approach often leads to large permission sets and uncontrollable permissions. Experience shows that it is better to define roles at task level and to authorize users by referring to the organization roles (organization roles, permissions and responsibilities). By grouping organization roles into organization profiles, the authorizations a user needs to execute his or her organization profile can easily be identified. Default Dynamics NAV provides no functionality to group roles into profiles.
The module Authorization Management simplifies the setup of authorizations. The core of the module inserts a layer between users and permission sets. This layer is called organizational roles (formerly user profiles) and it groups permission sets. This way the administrator is able to create profiles that are (almost) equal to the organization roles within the organization. Authorizing users is then limited to linking users to an organizational role, rather than to an often uncontrollable number of roles.
The recording of Permission Sets makes it possible to add new permissions within your Dynamics environment.
Beforehand there are a number of things that are important to set up or have at your disposal:
Super rights to your “test account”.
Your Dynamics environment with the 2-Control compliance module at hand, so that you can start the recording there as well.
Optional: Two Dynamics accounts - one to start recording and one to perform the steps.
To record a Permission Set, you must first start recording in Dynamics and select the user whose steps you want to record. To start a recording follow the steps below:
In the search bar, search for ‘Permission Sets’
You can then click on a Permission Set or make a new one.
Click on the Action tab at the top
Click on the Start button
The recording in Dynamics has now started. The user now performs his or her actions to get the Table Data into the Permission Set.
To clean up permission sets, you can choose the set from the dropdown menu to ensure that pages and reports are filtered out of the Permission Set and only the usable Table Data remains. An example of a permission set that can be filtered is 2C-GEN-ALL.
When you have stopped recording you will see the screen below. You can see the details of all the Table Data that was added to the Permission Set during recording.
If you have closed the recording session in Dynamics you will return to your NAV environment. By clicking on the “Stop” button, your recording will now also be closed here.
You can use Organizational Roles (formerly user profiles) to group roles to an organization function. This way you can easily authorize users.
This procedure explains how you can create an Organizational Role and how to link Permission Sets (formerly user roles).
To create an Organizational Role within the NAV compliance add-on, go to Departments > Compliance > Security > Authorization Management > Organizational Roles. In this screen you have an overview of all the defined Organizational Roles.
You can edit Organizational Roles or create new Organizational Roles. To create a new Organizational Role click New (or Ctrl+N).
1. Complete the following fields in the header:
Organizational Role: a clear and recognizable code.
Description: e.g. Purchase Employee.
2. Link Permission Sets: Click the arrow-down button in the field Permission Set ID in the sub window Permission Sets to select the Permission Set you want to link. Repeat this activity until you have linked all desired Permission Sets. With the option All Companies it is possible to indicate that the Permission Set applies to all companies. This option overrules the company authorization of the Organizational Role.
3. Link users: Click the arrow-down button in the field User ID in the sub window Users to select a user you want to link. Optionally complete the following fields for the user:
Company(Group): Add a company name to set company specific security. If it is empty the security is for all companies.
Starting Date: Used for starting the assignment of the Permission Sets at a specified date.
Ending Date: Used for ending the assignment of the Permission Sets at a specified date.
Repeat this activity until you have linked all desired users. It is possible to link both database and Windows logins in this window.
After all applicable Permission Sets and users are linked to the Organizational Role, the profile can be released after acceptance by the organization. A released Organizational Role cannot be modified.
Release the organizational role by selecting Actions > Functions > Release (or Ctrl+F9).
You can only release (and synchronize) Organizational Roles if all linked Permission Sets are released. See Manage Permission Sets.
A released Organizational Role should be synchronized to activate the authorizations for the linked users. Only Organizational Roles with statuses Released or Synchronized can be synchronized.
Synchronizing actually assigns the Permission Sets to user accounts as configured in the Organizational Roles. Synchronizing a link which is already assigned reapplies the setup to the user account.
You can synchronize the current Organizational Role by selecting Actions > Functions > Synchronize Organizational Role (or F9). The field Sync Status on the header changes from To be synchronized to Synchronized All.
The module Authorization Management helps you manage your users in a transparent and clear manner. You can easily insert, modify, delete and authorize users.
To create a user go to Departments > Compliance > Security > Authorization Management > Users (2C). Click on the button New (or Ctrl + n).
Choose login type: this field indicates which type of user you are managing (Windows Login or Database Login). Modifying this field is only possible during insertion of the user. Enter the ID of the user depending on type:
Database Login: You can directly enter an ID.
Windows Login: You can insert a user by selecting the lookup button of field User ID and selecting a user from the Active Directory of Windows.
Inserting a new Windows user is only possible in the Classic client; the Active Directory table is not accessible in the RTC. Complete the other fields in the header. The following fields apply only to the type Database Login:
Name: The name of the user.
Password: The password of the user.
Expiry date: (Optional) the expiration date of the login.
Note! If Dynamics NAV is used in combination with Microsoft SQL-Server, a new database user should be added at the SQL-Server first.
You can link Organizational Roles to the user in the sub window Organizational Roles (formerly User Profiles). The sub window Assigned Permission Sets (formerly user roles) shows the permission sets that are assigned to the user through the Organizational Roles; they are shown for information only. Permission Sets that are assigned outside the module (through the Dynamics NAV Classic security menu Tools - Security) are shown in red. These permission sets are not assigned to the user by the module Authorization Management and are not linked to an Organizational Role.
Per line you can enter an Organization Role and link the role to the user. Optional fields are:
Company(Group): Add a company name to set company specific security. If it is empty the security is for all companies.
Starting Date: Used for starting the assignment of the organizational role at a specified date.
Ending Date: Used for ending the assignment of the organizational role at a specified date.
To activate the authorization of a user, synchronize the user.
Membership of groups in Active Directory can be seen for the active user card. When the active card is a group, both account membership and membership of other groups can be shown. Use the buttons under Actions > Functions > Process.
It is possible that error messages arise in the Permission Sets that you have included in your Dynamics Environment (or Authorization Box). In this Walkthrough, you will learn how to resolve an error message from a Permission Set with too few permissions.
In this Example, the Permission Set 2C-SLS-ORDER is added to the user. When you go to Sales Order and click on the ‘New’ Button in the Dynamics Environment, the error message below will appear.
The message indicates that the user has no Modify permissions on the header of the Sales Order. Since it is desirable to be able to create a sales order, these permissions must be added to the Permission Set 2C-SLS-ORDER.
To add access permissions to the Permission Set, the following steps must be completed:
Step 1: Right at the top of your Dynamics environment, search for ‘Permission Sets’ and click on it.
Step 2: Select the Permission Set you want to edit (in this example 2C-SLS-ORDER) and click on Permissions.
Step 3: You can now add the necessary objects from the error message. You do this by choosing Object Type: Table Data. Via Object ID you can search for the required table (Object Name) to which you want to give permissions (in this case: Object ID 36 Sales Header).
Step 4: You need Modify permissions to create a Sales Order. Under Modify Permission, click the down arrow and then select ‘Yes’. In some cases you may want to set the permission to ‘Indirect’, but in this example we choose ‘Yes’ because the user must be able to create a Sales Order directly.
The error message from this example has now been resolved. If you are going to resolve error messages, it is important that you know which table (Object Name) appears in the message. You also need to know which permissions you want to add (Read, Insert, Modify, Delete or Execute).
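As an illustration (hypothetical values), the resulting permission line in 2C-SLS-ORDER could then look like:

Object Type: Table Data | Object ID: 36 | Object Name: Sales Header | Read: Yes | Insert: Yes | Modify: Yes | Delete: (empty)

Which of the other permissions (Read, Insert, Delete) are required depends on the tasks the permission set must support.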
With the option Export Authorization matrices to Excel you can view the authorization setup in Excel. Both roles per profile and users per profile can be exported. Furthermore, you can filter on permission sets, permission sets per organizational role, organizational roles, users per organizational role and users.
Requirements for export authorization matrices to Excel are:
To export the Authorization matrices to Excel go to Departments > Compliance > Security > Authorization Management. Click on Export Authorization matrices to Excel.
In the next screen you have four options:
You also have the option to set specific filters. Once all the filters are set, click OK to export to Excel.
You can also set filters in the menu. There are options to set filters for Permission Set, User Role per User Profile, User and User per User Profile.
During the implementation of the Authorization Box multiple environments are used. Building Permission Sets and testing is normally done in a test environment, acceptance takes place in the acceptance environment, and go-live happens in the production environment. It is possible to migrate the authorizations and setup to a different environment. This involves Dynamics NAV/Dynamics 365 BC and the Authorization Box database connection.
To migrate the authorizations you can follow the steps below:
1. Check if all Organization Roles are synchronized;
2. Export the Permission Set/Permission and Authorization Management Settings via Compliance/Security/Authorization Management under Setup:
3. Create a RapidStart package for the Permission Sets and Permissions. For version 2016 and 2017 these are tables 2000000004 and 2000000005. Export this package;
4. Export the Field and Dataset Security via Compliance/Security/Field and Dataset Security Import/export Field and Dataset Security settings;
5. Write down the numbering that is used;
6. Log in with a SUPER account without company restrictions. This SUPER assignment will be kept as long as you do not delete this Permission Set, so beware!
7. Check the settings (Authorization Management, Field and Dataset Security);
8. Back up (if desired) the Access control table (table 2000000053) from the Development Environment. Run this table and copy all rows and then paste them into Excel;
9. Export (if desired) the setup from the new environment as a backup in case the import from the other environment does not work. See steps 2 and 3;
10. Run report 11111993 from the development environment to clean up some settings. Determine what should be removed. Select everything except the first option. Click Preview to delete the data:
11. Delete all Permission Sets (user roles) except SUPER. Filter on <>SUPER, then right-click and choose Delete;
12. Check whether all users have been created in the new environment and choose refresh users in the Users(2C) overview;
13. Import the RapidStart package containing the Permission Sets and Permissions from step 3. First the Permission Sets (Select line, Functions, Apply data), then the Permissions (Select line, Functions, Apply data);
14. Release all Permission Sets (2C) via Compliance/Security/Authorization Management/Permission Sets (2C). Select them all and click on Release Selection;
15. Import the Authorization management Settings via Compliance/Security/Authorization Management, Export/import Authorization Management settings. (This gives the user his Organization Roles with the linked Permission Sets);
16. Check if you are linked to an Organization Role. If so, your permissions can be removed by the system (depending on the Authorization Management, Synchronization Type setting). Make sure you keep SUPER. Make sure you are linked to an Organization Role (without company restriction) that has the SUPER Permission Set linked to it;
17. Release all Organization Roles. This can be done by running the 2C User profile table (11111979) from the development environment and setting all lines to Released status;
18. Synchronize all Organization Roles (Compliance/Security/Authorization Management/Organizational Roles) via the Synchronize All button;
19. Remove all Field, Dataset Security and mandatory Fields. This is only necessary if adjustments have been made to this!
20. Check whether the number series (see step 5) exist in this new environment;
21. Import the Field and Dataset Security settings. Note: This is only necessary if adjustments have been made within Field and Dataset Security;
22. Review the number series for Field and Dataset security. Make sure the Last No. used matches the highest number used in the Field and Dataset security.
23. SUPER assignment:
Must not be assigned to Windows groups, otherwise users can still have SUPER;
Check which accounts have SUPER: Compliance/Security/Authorization Management/Permission Sets (2C), column No. of linked Users for Permission Set SUPER;
24. Check whether all users have an Organization Role: Compliance/Security/Authorization Management/Users (2C), column No. of Linked Organization Roles;
25. Check whether users are linked to all Organization Roles: Compliance/Security/Authorization Management/Organizational Roles, column Number of linked Users;
26. Check that all Mandatory fields and the Field and Dataset securities are active (start date in the past and no end date);
27. Check the User Personalizations (RTC profile and client language), User Setup (authorization, posting from/to, etc.) and Warehouse Employees where applicable.
Go to the Dynamics NAV/BC test environment where the new Permission Sets are built.
1. Export the Permission Sets
2. Export the Field and Dataset Security (if applicable) via Compliance/Security/Field and Dataset Security Import/export Field and Dataset Security settings;
3. Check the numbering that is used:
4. Export the Mandatory Field Settings (if applicable) via Compliance/Control/Mandatory Fields Export/Import Mandatory Field Settings;
5. Check the numbering that is used for Mandatory Fields:
Go to the Authorization Box (connection with Test database)
6. Make a backup from the test environment via Setup, Backups, Export and select what you want to export:
7. Make an export of the Organization Chart via Authorization Framework, Organization Chart. Press the Export/Import button next to the Organization Chart Name and then choose Export Structure for an export with Permission Sets per Organization Role and Users per Organization Role. The assumption is that all users have the correct permissions in the test environment.
Go to the Dynamics NAV/BC production environment
8. Import the Permission Sets by importing the package from step 1 containing the Permission Sets and Permissions. First the Permission Sets (Select line, Functions, Apply data), then the Permissions (Select line, Functions, Apply data). In Dynamics 365 BC you can use the Import functionality from the Permission Sets;
9. Remove all Field Security, Dataset Security and Mandatory Fields settings if applicable. This is only necessary if adjustments have been made to these;
10. Check if the number series (see steps 3 and 5) exist in this new environment;
11. Import the Field and Dataset Security settings;
12. Review the number series for Field and Dataset security. Make sure the Last No. used matches the highest number used in the Field and Dataset security;
13. Import the Mandatory Fields Settings;
14. Review the number series for Mandatory Fields. Make sure the Last No. used matches the highest number used.
Go to the Authorization Box (connection with production database)
15. Make sure you have a working connection with the production database;
16. Set up the Approval settings and Approvers. For the import of the initial structure we recommend processing the changes without approval;
17. Check the Permission Sets via Authorization Framework, Permission Management, Permission Sets;
18. Import the backup via Setup, Backups, Import and select the file from step 6:
19. Check the Organization Chart;
20. Check whether the Permission Sets are linked to the Organization Roles;
21. Check the setting ‘Default overwrite Current Permissions’ via Setup, General, Edit. With this setting users will only get permissions based on the assigned Organization Role. Make sure the correct users keep SUPER!
22. Import the Users per Organization Role. Use the file from step 7. Note: In both environments the users and the companies must be the same;
23. Check the processed Authorization Requests;
24. Change the approval settings if required.
Authorizations are set up according to a concept where employees / organization profiles are responsible for certain data in Dynamics NAV. In addition, the authorizations ensure segregation of duties in the organization. Default Dynamics NAV has no functionality to review the quality of the configured authorizations, so organizations have no instrument to control their authorizations. The module Authorization Monitoring provides real-time functionality to monitor the quality of the permissions assigned to users. The module needs to be set up before use: generic settings, standard competences (who can do something?) and conflicts (which segregation of duties is breached?).
Based on the setup of standard competences and conflicts, separate analyses can be made. The assessor can evaluate and comment on the results.
For an analysis, all object authorizations in Dynamics NAV are processed, based on the process or standard competence chosen. The results present all users who match the question asked in the standard competence and through which role and profile they have those permissions. These results can be evaluated (agreed or disagreed), including optional arguments and further comments. Storing all evaluations makes it traceable who evaluated what, when, and with which result. Any field and dataset security (another module of 2-Controlware) in use is also visible in the results.
In a standard competence a question is asked, for example: who can modify items? This is described by adding the permissions (read, insert, modify, delete and execute) required to do so. Standard competences also contain the process of which the activity is part, settings for managing the analysis and settings for simplifying the evaluation. For each standard competence an organizational role (formerly user profiles, from 2-Controlware Authorization Management) that is correct in having those permissions can be set up, e.g. for modifying items. Examples of standard competences:
The auditor might ask questions such as:
Who is able to modify items?
Who is able to post purchase invoices?
For the first question, a line with object type Table Data and object ID 27 (table Item) on the standard competence card is necessary. Put Yes at Modify on the line for object ID 27. The results will present all users able to modify a field on the item card. Insert and Delete are not necessary to modify a field. Note that all users need permission to read table data Item, so all users in the system will show up in analysis results for the Read permission. With Read permission alone it is not possible to change anything, but it is essential to open the item card and to use information from the table in other parts of the system, e.g. to create a line on a purchase order.
Objects of the type Table Data cannot be executed, so it does not make sense to analyze the Execute permission for them.
The above presumes direct permissions, which users need to perform an action on an item card or a list. Dynamics NAV also has indirect permissions. These act as a serving hatch: you cannot perform the action yourself, but code does it for you. In other words: the code decides which field of which record is modified and what the new value will be. This is often used for (cost) prices and logistics information: those item properties are administered by the system. Direct permissions are also sufficient to perform tasks that require indirect permissions, but this does not work the other way around.
The second example requires a different approach. Most posted documents are stored in two tables: the header and the line. Posting a document means creating new lines in those dedicated tables, apart from creating several types of entries. However, since those entries are used for a multitude of actions, analyzing their permissions gives unreliable results. Posting a document is usually done by code, so you need to analyze indirect permissions. In the example you need to check tables 122 (Purch. Inv. Header) and 123 (Purch. Inv. Line).
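To illustrate the difference (a simplified sketch, not 2-Controlware or standard posting code): in C/AL a codeunit can declare the table permissions it needs in its Permissions property. A user who only holds indirect permissions on the posted tables can still post through such a codeunit.

In the properties of the posting codeunit:
  Permissions=TableData 122=imd,TableData 123=imd;

OnRun=BEGIN
        // Code running inside this codeunit may insert, modify and delete records in
        // tables 122 (Purch. Inv. Header) and 123 (Purch. Inv. Line), even when the
        // calling user only holds Indirect permissions on those tables.
      END;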
As an auditor, to translate your question into a standard competence you can open the page whose permissions you want to analyze. Then press the shortcut key Ctrl+Alt+F1 or open the application menu, Help, About This Page.
Apart from the analysis of permissions, the module Authorization Monitoring can also search for conflicting competences. These conflicts are configured on the standard competence or using a conflict matrix. An example of a conflict is a user who has permissions both to process payments and to modify the bank account of the beneficiary. Both activities need a standard competence for the analysis; setting up the combination of those standard competences enables analysis of the conflict.
When searching for conflicting competences in the object authorizations, the software analyzes the permissions for both standard competences configured in a conflict. The results consist of the users who appear in both result sets. Evaluation of those conflicts works the same as the evaluation of analyzed permissions.
You can create Standard Competences and link allowed permissions to them. To create Standard Competences go to Departments – Compliance – Security – Authorization Monitoring – Standard Competence. In this screen you have an overview of all Standard Competences.
To create a new Standard Competence, follow the steps below:
Click on the ‘New’ Button.
Fill in the fields as in the example below (or as you wish). For more information about the fields, see Work with Standard Competences.
Click on OK.
You have now created a Standard Competence on the table ‘Item’.
When you go to Departments – Compliance – Security – Authorization Monitoring – Standard Competence per Object, you can see in the overview at Table Data 27 (Item) that the standard competence you created has been linked.
The Standard Competence can now be used for an analysis of permissions or conflicts.
You can create Conflicting Competences in the Standard Competences screen. Go to Departments – Compliance – Security – Authorization Monitoring – Standard Competence. Next to the ‘New’ button in the overview screen of the standard competences, you can click Conflicting Competences to create a new one.
To create a new Conflicting Competence, follow the steps below:
Click on the button ‘Conflicting Competences’
You will now see the list of conflicting competences. Click on ‘New’ or click in an empty line on the down arrow under the column Competence.
Select a Competence and then under ‘Conflicts with’ the Competence that should show conflicting results in an analysis.
Then fill in the other fields to provide additional information to the conflict, such as ‘Internal Control Impact’ and ‘Conflict Reason’.
You have now created a Conflicting Competence. You can now use this when analyzing Conflicts.
After defining the Standard Competences the system can analyze the permissions. To analyze Permissions go to Departments – Compliance – Security – Authorization Monitoring. Then click on ‘Analysis Permissions’ under tasks.
The monitoring software generates results based on the objects (permissions) linked to the Standard Competences. The results depend on the setting Check Method (Standard or Object) on the Critical Permission and the setting And/Or RIMDX on the objects.
There are three Source Types of permission results:
Permission Set: the linked object is found in the Permission Set
Organizational Roles: the Permission Set (with the linked object) is found in the Organization Role
User: the Permission Set (with object) is linked to the User.
To analyze permissions, follow the steps below:
First, set the desired fields under the heading ‘General’. The Standard Competence Filter is important because it can be used to analyse a specific Competence. If it is empty, all Standard Competences will be analysed.
You can also refine your analysis by selecting check boxes, such as Calculate with Excluded Permission Sets (e.g. to include results from users with SUPER permissions).
When you have set up everything, click on ‘Calculate’. Dynamics will now analyse the permissions.
After running the analysis you will see an overview of all results as in the example below.
You have now analysed permissions. If you want to filter specifically on a Source Type or Finding, you can use the filters below:
After defining conflicts the system lists all the conflicts and the user can analyze them. To analyze conflicts go to Departments – Compliance – Security – Authorization Monitoring. Then click on ‘Analysis Conflicts’ under tasks.
To analyze conflicts, follow the steps below:
You have now analysed Conflicts. If you want to filter specifically on a Source Type or Finding, you can use the filters below:
After an initial analysis you will find results with the status To be Reviewed, Agreed or Disagreed. Agreed means that, based on the setup, the result poses no risk. If you have a lot of analysis results, it is an option to first set up ‘Default Evaluation for Accepted Finding’ in a Standard Competence. After a new analysis more results will have the status Agreed and fewer To be Reviewed.
You can evaluate the findings for Critical Permissions. Go to Departments – Compliance – Security – Authorization Monitoring, then click on ‘Analysis Permissions’ under tasks. You can add your review by clicking in the Evaluation column.
You have the following choices for the evaluation:
To be Reviewed
Agreed
Disagreed
Actual Permissions Changed
Standard Changed
Actual Permissions and Standard Changed.
Depending on the changes you have made, these evaluations can automatically come out of the analysis.
You have now evaluated permissions. If you want to filter specifically on an Evaluation, you can use the filters below:
If something has changed in the permissions, authorizations or standards, click on the button from the example below at the top of the screen to create a change log overview:
The review of the Conflicting Critical Permissions can be done in the same way as for Critical Permissions. You can evaluate the findings for Conflicts. Go to Departments – Compliance – Security – Authorization Monitoring, then click on ‘Analysis Conflicts’ under tasks. You can also add your review by clicking in the Evaluation column.
In the window Authorization Monitoring Setup you define the setup to work correctly with the module Authorization Monitoring.
Open the window Authorization Monitoring Setup by selecting menu Departments - Compliance - Security - Authorization Monitoring - Setup - Authorization Monitoring Setup.
Select the role with permissions to modify evaluations in the field Reviewer. Only users linked to this role are allowed to modify evaluations in the Analysis windows.
Automatic Evaluation Posting: check this field to automatically post new evaluations in the Analysis windows.
Register Evaluations: select ‘Changed’ to register only changed evaluations (since last posting) in the window Analysis Permissions, select ‘All’ to register all evaluations.
The field Action by Agreeing profile enables you to define a default action for automatic acceptance of a profile: None (empty), Agree Users, Agree Permission Sets or Agree both. A default action for agreeing a profile might be configured as a generic option and overruled per standard competence.
Automatically accepted findings are registered with the evaluation ‘Agreed’ by default. This might be changed to ‘To be reviewed’ in the field Default Evaluation for Accepted Finding.
Manually Calculate Field Security: check this field to calculate the field and dataset security settings manually (one time) instead of each time during analysis. Calculate by selecting Actions, Functions, Calculate Field Security.
The module Authorization Monitoring is based on the organization types of Starreveld. The organization types can be linked to a process. Organization Types can only be deleted if they are not used (so if they are not linked to a process).
Follow the steps below to setup organization types:
Open the window Organization Types by selecting menu Departments - Compliance - Security - Authorization Monitoring - Setup - Organization Types.
Insert a new organization type by using the standard Dynamics NAV method (Ctrl+N).
Define the organization type for your organization.
To judge the quality of the authorization per process and sub process, the processes and sub processes have to be defined in Dynamics NAV. It is necessary to define the processes and sub processes in collaboration with the organization. The defined processes and sub processes are entered in the module Authorization Monitoring and linked to standard competences. Processes and sub processes can only be deleted if they are not used (so they are not linked to a standard competence).
Follow the steps below to set up processes:
Open the window Processes by selecting menu Departments - Compliance - Security - Authorization Monitoring - Setup - Processes.
Insert a new process by the standard Dynamics NAV method (Ctrl+N).
Define a clear and recognizable code and description.
In the field Organization Type, use the arrow-down button to select the organization type that applies to this process.
To judge the quality of the authorization per process and sub process, the processes and sub processes have to be defined in Dynamics NAV. It is necessary to define the processes and sub processes in collaboration with the organization. The defined processes and sub processes are entered in the module Authorization Monitoring and linked to standard competences. Processes and sub processes can only be deleted if they are not used (so they are not linked to a standard competence).
Follow the steps below to set up sub processes:
Method 1: Through menu items
Open the window Sub Processes by selecting menu Departments - Compliance - Security - Authorization Monitoring - Setup - Sub Processes.
Insert a new sub process for the desired process with the standard Dynamics NAV method (Ctrl+N).
In the field Process, use the arrow-down button to select the process applicable to this sub process, and define a recognizable code and description for the sub process.
Method 2: Through window processes
Open the window Processes.
Select the process from which you want to manage the sub processes.
Open the sub processes by selecting Related Information, Process, Sub Processes.
Now define the sub processes as in the first method.
In this window you can set up permission sets (formerly user roles) that will be excluded from the analysis of the permissions. If the organization has accepted certain risks, those risks do not have to be shown in the analysis. The permission set SUPER, for example, is always an actual risk, because users with this set are able to modify all data in the database. If this set were included in the analysis, it would always show up as an actual risk, which leads to an unclear analysis. For this reason you can exclude permission sets from analyses.
Follow the steps below to set up the Excluded Permission Sets:
Open the window Permission Set Setup by selecting menu Departments - Compliance - Security - Authorization Monitoring - Setup – Excluded permission sets.
Press Ctrl+N to create a new permission set setup.
In the field Permission Set ID, use the arrow-down button to select the permission set that you want to exclude from the analysis.
Each organization has its own interpretation of the quality of the authorizations. Assessing the quality therefore requires standards against which the quality can be assessed.
The desired quality of the authorizations is specified in the module Authorization Monitoring by means of standard competences. These competences are used to determine the quality of the authorizations and the ownership of the responsibility for data.
Tip! When you are setting up the standard competences, make sure you distinguish between standards for data ownership (e.g. item management) and standards for defining segregation of duties (e.g. purchase and receive).
In the window Standard Competence you can define standard competences and link allowed permissions. By linking organizational roles (formerly user profiles) to the Standard Competence you can make users responsible for certain data. Additionally, you can set function conflicts by linking Standard Competences.
To be able to monitor the responsibility of users for data and tasks, the organization has to define standard competences. These standard competences can often be derived from the built permission sets (formerly user roles). This procedure explains how you can define a standard competence and how you can assign organizational roles (formerly user profiles) to a standard competence.
Open the window Standard Competence by selecting menu Departments - Compliance - Security - Authorization Monitoring - Standard Competence.
Fill in the fields as described below:
Code: Enter a recognizable code for the standard competence. It is recommended to use a naming convention, for example: “FIN xxxx”. FIN means Financial process and xxxx can be replaced for a short description of the standard competence. This way you have a well-organized set of standard competences.
Description: Enter a description for the standard competence.
Type: Select a function type (according to the theory of Starreveld): Management, Guarding, Accounting, Executing, Monitoring.
Business Risk: Enter the risk for the organization if a user is linked to the permissions, but not to the standard competence (so the user is able to perform tasks he is not allowed to), for example “Unauthorized changes in item data”.
Business Impact: Select the impact of the Business Risk: High, Average or Low.
Process: Select the process of the standard competence.
Check Method: Select how to check:
Object: Check object by object and by separate permissions.
Standard: Check all allowed permissions together.
Default Evaluation for Accepted finding: automatically accepted findings are registered with the evaluation ‘Agreed’ by default. This might be changed to ‘To be reviewed’ in the field Default Evaluation for Accepted Finding for the opened standard competence. The default configuration might be adjusted in the Actions, Functions, Authorization Monitoring Setup.
Action by Agreeing Profile: optionally, accepting a finding might also accept the findings for users and/or roles, depending on the configuration. This setting per standard competence overrules the generic setting in the Departments - Compliance - Authorization Monitoring - Authorization Monitoring Setup.
The other fields are filled automatically.
According to internal control theories, some authorizations can conflict and should be segregated (segregation of duties). E.g. the authorizations for entering a sales order and posting a sales delivery should not be held by one person. With conflicting competences the correctness of the segregation of duties can be monitored.
To analyse possible segregation of duties conflicts, the module Authorization Monitoring offers the functionality to define these conflicts. Segregation of duties conflicts can be defined in a Standard Competence with conflicting competences, using the window Standard Competence, or with the use of window Conflict Matrix.
Method 1: By standard Competences
Open the window Standard Competence by selecting menu Departments - Compliance - Security - Authorization Monitoring - Standard Competence.
The conflicting competences can be set up by selecting Conflicting Competences.
Press Ctrl+N to define a new conflicting competence.
Select the conflicting competences in field Competence and in field Conflicts with.
Choose the internal control impact and internal control risk, in case the conflict exists.
Define the conflict reason: Internal Control, Hierarchical, Mandatory Principles or Setting Standards.
Method 2: By Conflict Matrix
The conflicting competences can also be managed in the conflict matrix. In this window the lines and columns show the standard competences and a conflict can easily be set up by simply checking the intersection. In the sub window, you can define for each conflict the internal control impact, internal control risk and the conflict reason. When unchecking the intersection, the conflict will be deleted.
Open the window Conflict Matrix by selecting menu Departments - Compliance - Security - Authorization Monitoring - Conflict Matrix.
In this matrix, conflicting competences are set up by selecting the arrow-down button in the intersection between two competences.
Analyze segregation of duties with the module Authorization Monitoring.
Open the window Conflict Matrix by selecting Departments - Compliance - Security - Authorization Monitoring - Conflict Matrix.
Select the standard competence in the field Standard Competence Filter.
If you want to further restrict the standard competences shown, fill in the Process Filter and Company Filter.
If you want to hide non-conflicting competences or show the column names, tick the corresponding check boxes.
Your standard competences are listed on the lines. If you click the intersection of the two competences for which you want to create a conflict, a popup opens in which you can create the desired conflict.
Repeat the steps to create multiple conflicts.
Calculate conflicts
Open the window Analysis Conflicts by selecting Departments - Compliance - Security - Authorization Monitoring - Analysis Conflicts.
Select the standard competence in the field Standard Competence Filter.
To analyse the defined conflicts, the analysis of the authorizations has to be calculated. Calculate the window Analysis Conflicts by selecting Actions, Functions, Calculate. The (un)checked field Calculated shows whether you are analysing old or current data. The field is unchecked after changing filters. After recalculating the conflicts the field Calculated is checked.
Analyze conflicts
The window shows the conflicts in the segregation of duties on three levels: User level, Organizational Role level (formerly user profile) and Permission Set level (formerly user role). The analysis determines, based on the actual permissions and the allowed permissions, which users, permission sets and organizational roles have conflicting standard competences.
Register the evaluations by selecting Related Information, Register, Register Analysed Conflicts (or with F9). Evaluations might be registered automatically if this is configured in the Actions, Functions, Authorization Monitoring Setup.
Note! You can also execute the analysis directly from the window Standard Competence. Select Related Information, Competence, Analysis Conflicts to open the calculated analysis, filtered by the selected standard competence.
This chapter describes how allowed permissions can be linked to a standard competence.
Every standard competence represents a certain task or dataset for which one or more users hold authorizations. To evaluate the quality of the authorizations for these competences, allowed permissions have to be linked to the standard competence. In the sub window of the window Standard Competence, allowed permissions can be linked to a standard competence.
Follow the steps below to link the allowed permissions:
Open the window Standard Competence.
Determine the standard competences for a permission set (formerly user role).
Select in the field Object Type the object type of the allowed permission you want to link. In many cases this is the object type ‘Table Data’.
Select in the field Object ID by using the arrow down button the object ID of the allowed permission you want to link. The field Object Name is filled automatically.
Select in the fields Read, Insert, Modify, Delete and Execute the permissions that apply to the allowed permission. Select the minimum required permissions of the standard competence. The permissions Read, Insert, Modify and Delete only apply to the object type Table Data; the permission Execute applies to the other object types.
Repeat these steps until all your allowed permissions you want to monitor are linked to the standard competence.
Attention! It is important to determine the unique objects for a permission set: in other words, which objects and which permissions are the minimum requirement for the permission set? Usually you need the permissions to Insert, Modify and Delete.
Example:
The role Item Maintenance includes read, insert and modify permissions on table 27 (Item). In normative terms, the only user who may create and modify items is the Item Manager. Other users have read permission for items. The standard competence Item Maintenance therefore specifies the insert and modify permissions on table 27 (Item).
Before checking the actual authorizations, the system needs to know which users are allowed to have the standard competences. The assignment is not done at user level, but by assigning organizational roles (formerly User Profiles).
Follow the steps below to link organizational roles:
Open the window Organizational Role per Competence.
Link the organizational role to a standard competence by selecting Related Information, Organizational Roles per Competence.
Add or delete an organizational role with the standard method of Dynamics NAV (Ctrl+N or Ctrl+Del).
Default Dynamics NAV offers no possibility to secure individual fields within tables. The default authorization of Dynamics NAV is often experienced as too broad, and the possibility to secure datasets within tables is missing in Dynamics NAV.
The module Field and Dataset Security offers the possibility to secure fields and datasets in tables. With the module Field and Dataset Security the data owners are able to manage their field security and dataset security. The security officer is authorized to create new field securities and to assign a data owner to a field and dataset security. The starting and ending date indicates when the field security is valid.
To make use of Field & Dataset Security you need the following permissions.
1. Open the window Field and Dataset Security Setup by selecting Departments - Compliance - Security - Field and Dataset Security - Setup - Field and Dataset Security Setup.
2. Enter in the field Security Officer the permission set (formerly user role) that is used for security officers. The security officer is authorized to create and link data owners to new field and dataset security.
3. Choose Permission Sets with Access in the field Filter Assign Permission Sets to show, when you are linking permission sets, only the permission sets with access to the table of the field or dataset security. The option All Permission Sets shows all available permission sets.
4. Assign on the FastTab Numbering the number ranges used for field and dataset security. You can use these number ranges to assign numbers to the field and dataset security automatically.
Define on the FastTab Default Checks the initial settings for new Field and Dataset securities.
Field Security Checks:
Insert Check: check on insert.
Modify Check: check on modify.
Delete Check: check on delete.
Dataset Security Checks:
Insert Check: check on insert. Only valid if the filter field is part of a key.
Modify Check Current Value: check on modify of the current value of the filter field.
Modify Check New Value: check on modify of the new value of the filter field.
Delete Check: check on delete.
Open the window Field and Dataset Security Setup and open the FastTabs Calculate for Source Types and Calculate for Security Types.
Each checkbox represents a type which could be included in the summary. The example shows all boxes checked.
Field and dataset securities should not be set up for every table. For example, entry tables should not be secured, both for performance reasons and because they cannot be altered with customer licenses. In the Table Categories all standard tables of Business Central are categorized and, if applicable, blocked for field security. An import file with the default Business Central tables is provided by your Microsoft Business Central partner.
This procedure explains how you can import and define table categories.
We categorized the default NAV tables into several categories and blocked setup for several tables that might pose performance problems. Follow these steps to insert this setup. Deviations might also be set up here.
1. Open the window Table Categories by selecting Departments - Compliance - Security - Field and Dataset Security - Setup - Table Categories.
2. Import the table categories by selecting Actions, Functions, Fill Table Categories.
3. Press Ctrl+N to create a table category.
4. Select in the field Table ID the table you want to add. The field Table Name is filled automatically.
5. Check the field Blocked if this table should not be used in field and dataset securities or for mandatory fields.
6. In the field Module, enter the Business Central module to which the table applies.
7. In the field Table Type, select the table type that applies to the table.
There are two windows to inspect and view the applied field and dataset securities. These windows are separated into Field Securities and Dataset Securities. Pressing Calculate Summary refreshes (recalculates) the applied securities.
Open the window Summary of Secured Fields per Source by selecting Field and Dataset Security - Summary of Secured Fields per Source.
Open the window Summary of Secured Dataset per Source by selecting Field and Dataset Security - Summary of Secured Dataset per Source.
Navigate to a Field Security by opening the window Field Security by selecting Departments - Compliance - Security - Field and Dataset Security - Field Security.
A Field Security contains the following fields. Configure the Field Security to your preferences.
No.: Number of the Field Security, this can be automatic and / or manual, based on the number series settings.
Description: A field to describe the Field Security.
Table ID: You can select the ID of the table that you want to secure on field level.
Table Caption and Module: Automatically filled by the Table ID. You can't modify this.
Note! Blocked tables in the Table Categories cannot be used for field security. If you select a table that is not categorized you will receive a message.
Filter Field No.: This field is optional; you can secure a part of the table by using a filter.
Filter Value: This field is optional; fill in the specific values of the field from Filter Field No.
Module: Automatically filled by the Table ID.
Table Type: you can retrieve the available options by entering 0|1|2|3|4|5|6|7|8|9.
Data Owner: Select the Data owner, the Data Owner is entitled to manage the field security for the table.
Default Editable: Select the option whether all fields of the table, that are not setup in the field security lines, will be editable or non-editable.
Start Date: Fill in the date from which the field security applies (required).
End Date: Fill in the date until which the field security applies (optional).
Changing the checks in the header also modifies the checks in the assigned permission sets.
Insert Check: Secure table by insert actions.
Modify Check: Secure table by modify actions.
Delete Check: Secure table by delete actions.
Complete these fields per line:
Field No.: Select the field to be secured.
Editable: Select the option; you can choose between editable and non-editable.
Initial Field Entry Allowed: Select this option to allow the field to be filled once; after it is filled it cannot be modified. If this option is not selected, the field security blocks any modification of the field.
No. of Assigned Permission Sets: Shows the number of assigned Permission Sets.
Deviating Trigger Checks: Shows whether the checks differ from the checks selected in the header.
Description: you can use this for documentation purposes.
Note! Some fields are hidden by default:
Only Page Control: disables editability of a field on the page, while it remains editable at database level, e.g. from a codeunit or a report.
Visibility: Fields might be made invisible on a page (customization might be needed).
No. of Customized Windows: This shows the number of forms on which the invisibility property for the field is programmed (customized).
After defining the security of a field you have to assign the Permission Set(s) to which the security applies. You can assign multiple Permission Sets; use a new line for each Permission Set.
Select the specific line of the secured field and click the arrow-down button in the field Permission Set ID to select the Permission Set for which the security is applicable.
You can optionally configure the following settings for the assigned Permission Set:
Company: Add a company name to set company specific security. If it is empty the security is for all companies.
Starting Date: Used for starting the assignment of the Permission Set at a specified date.
Ending Date: Used for ending the assignment of the Permission Set at a specified date.
Navigate to a Dataset Security by opening the window Dataset Security by selecting Departments - Compliance - Security - Field and Dataset Security - Dataset Security.
A Dataset Security contains the following fields. Configure the Dataset Security to your preferences.
No.: Number of the Dataset Security, this can be automatic and / or manual, based on the number series settings.
Description: A field to describe the Dataset Security.
Table ID: You can select the ID of the table that you want to split up into datasets.
Note: Blocked tables in the Table Categories cannot be used for dataset security. If you select a table that is not categorized you will receive a message.
Table Caption and Module: Automatically filled by the Table ID. You can't modify this.
Table Type: you can retrieve the available options by entering 0|1|2|3|4|5|6|7|8|9.
Data Owner: Select the Data Owner. The Data Owner is entitled to manage the dataset security for the table.
Starting Date: Fill in the date from which the dataset security applies (required).
Ending Date: Fill in the date until which the dataset security applies (optional).
Changing the checks in the header also modifies the checks in the assigned permission sets.
Insert Check: Check table by insert actions.
Modify Check: Current Value: Secure table by modify actions of current values.
Modify Check: New Value: Secure table by modify actions of new values.
Delete Check: Check table by delete actions.
To split the table into datasets, navigate to the lines. Datasets are defined with one or two filter fields with which the permissions of users are restricted.
Note: If you are using the visibility option, some modifications to the applicable forms / pages are necessary in Microsoft Dynamics NAV 2015 and older.
Note! Add the number of lines equal to the number of datasets the table has to be split up into.
You have to fill in the following fields per line:
Dataset Type: select whether the dataset is defined for editability or visibility. Visibility is only applicable for customized forms / pages.
Filter Field 1 (and 2): datasets can be defined by one or two fields of the table. Select the field that is used for filtering.
Filter 1 Type (and 2): Select Permission Set or User.
Filter 1 Code (and 2): Only applicable if Filter Type = User; this refers to the User Filter that is used to get the user-specific filter value.
Filter 1 (and 2): Applicable if Filter Type = Permission Set, enter the filter value that is applicable for the dataset line.
For option fields, e.g. document type and type in table Purchase Line, you can retrieve the available options by entering 0|1|2|3|4|5|6|7|8|9.
Filter And Or: Defines whether the value of Filter 1 and / or Filter 2 is applicable for the dataset. If the Dataset Type is Visible, only And is allowed.
No. of Assigned Permission Sets: This field shows the number of roles that are linked to the dataset security.
Deviating Trigger Checks: Marked if the checks differ from the checks selected in the header.
You can assign Permission Set(s) for which the dataset security is applicable. You can assign multiple Permission Sets; use a new line for each Permission Set.
Select the specific line of the defined dataset and click the arrow-down button in the field Permission Set-ID to select the Permission Set for which the security is applicable.
You can optionally configure the following settings for the assigned Permission Set:
Company: Add a company name to set company specific security. If it is empty the security is for all companies.
Starting Date: Used for starting the assignment of the Permission Set at a specified date.
Ending Date: Used for ending the assignment of the Permission Set at a specified date.
In some situations it is better to assign filter values to dataset securities on user level than by permission set (formerly user role). Examples are:
The client wants to implement a detailed dataset security for all users with many different cost centers (e.g. set in Global Dimension 1 or 2).
The client has some users whose filter value deviates from that of other users with the "same" dataset security restrictions.
We recommend that you investigate the possibility of using a permission set for granting filter values to users (as explained in Setup Dataset Security). If this leads, for example, to too many permission sets, you can consider using a User Filter.
The maintenance of User Filters consists of two steps:
Define, during the setup of Dataset Security, a User Filter.
Maintain the user-specific filter values for the User Filter.
You can define as many User Filters as necessary.
Note: The use of User Filters leads to an extra effort in the user maintenance during personnel changes.
1. Open the window User Filter List by selecting Departments - Compliance - Security - Field and Dataset Security - Setup - User Filter List.
2. Add a User Filter and complete the fields:
Code: Enter an appropriate abbreviation for the User Filter.
Description: Enter a description of the User Filter.
The created User Filters can be used in the Dataset Security.
The user-specific filter value is only applicable for users linked to a User Filter. The user-specific filter value overrules a possible derived filter value from an attached permission set.
1. Open the window User Filter List by selecting Departments - Compliance - Security - Field and Dataset Security - Setup - User Filter List.
2. Select the User Filter that needs to be maintained for a user-specific filter value.
3. Open the window User Filter by pressing Ctrl+Shift+V. In this window user-specific filter values can be maintained by adding new users, changing the filter value of existing users or by removing users from the list.
4. Select an existing user or add a new user and complete the fields:
Filter Text: Enter the user-specific filter value applicable for the User Filter.
Note! If the filter value is used for an option field the input has to be in the application language of the specific user.
Company: Add a company name to set company specific security. If it is empty the user filter is for all companies.
Select Dataset Security Checks:
Insert Check: Check on insert. Only valid if the filter field is part of a key.
Modify Check Current Value: Check on modification of the current value of the filter field.
Modify Check New Value: Check on modification of the new value of the filter field.
Delete Check: Check on delete.
Starting Date: Optionally, a starting date for the filter value can be entered.
Ending Date: Optionally, an ending date for the filter value can be entered.
1. Open the window Field Security by selecting Departments - Compliance - Security - Field and Dataset Security - Field Security. The field securities per table can be managed in this window (insert, modify or delete).
2. Insert a new field security for a table by pressing Ctrl+N.
Note: Only the Security Officer is entitled to create a new field security.
3. Complete the following fields in the header.
No.: Automatic and / or manual, based on the number series settings.
Description: E.g. Field Security Table 27 Item.
Table ID: Select the ID of the table that you want to secure on field level. Blocked tables in the Table Categories cannot be used for field security. If you select a table that is not categorized you will receive a message.
Click Yes if you are sure that you want to setup the field security for this table.
4. Filter Field No. and Filter Value: Optionally, only a part of a table can be secured by use of a filter. Select the field to be filtered on in Filter Field No. and fill in the Filter Value.
For option fields, e.g. document type and type in table Purchase Line, you can retrieve the available options by entering 0|1|2|3|4|5|6|7|8|9. The software replaces this with the options from the list if you leave the field, e.g. Quote|Order|Invoice|Credit Memo|Blanket Order|Return Order|6|7|8|9 for document type in table Sales Line. Remove any invalid or not required option.
The type of field in the table is visible when selecting a field.
5. Data Owner: Select the Permission Set (formerly User Role) of the Data Owner of the table.
The Data Owner is entitled to manage the field security for the table.
The field security settings for the table are not applicable for the Data Owner.
Only the Security Officer is entitled to assign a Data Owner to a field security or to create a new field security.
6. Default Editable: Select whether all fields of the table that are not set up in the field security lines will be editable or non-editable:
Setting is only applicable for users with the linked roles.
7. Start Date: Enter the date from which the field security applies (required).
8. End Date: Enter the date until which the field security applies (optional).
The default checks, defined in Setup Field and Dataset Security are copied when inserting a new field security.
Modifying the checks in the header also modifies the checks in the linked permission sets (formerly user roles). Deviating checks in the underlying linked permission sets are marked in the field Deviating Trigger Checks.
In the sub window of the window Field Security, fields of the table can be selected. Per field the security can be defined for editability and visibility. Fields that are not linked have the Default Editable setting from the header and are visible.
Note: If you are using the visibility option in the field security settings, some modifications to the applicable forms / pages are necessary. Depending on your version and configuration, the option whether the field will be visible or invisible might be hidden.
1. Open window Field Security.
2. Add a line per field.
3. Complete the following fields per line:
Field No.: Select the field to be secured.
Editable: Select the option whether the field will be editable or non-editable.
Initial Field Entry Allowed: Select this option to allow the field to be filled initially while blocking later modifications. If this option is not selected, the field security blocks any modification of the field.
No. of Assigned Permission Sets: See below.
Deviating Trigger Checks: Marked if the checks differ from the checks selected in the header.
4. Description: can be used for documentation purposes.
5. Some fields are hidden by default:
Only Page Control: can disable editability of a field on the page, while it is editable on database level, e.g. from a codeunit or a report.
Visibility: Fields might be made invisible on a page (customization might be needed).
No. of Customized Windows: This is an information field that shows the number of forms on which the invisibility property for the field is programmed (customized).
After defining the security of a Field you assign the Permission Set(s) (formerly user roles) for which the security is applicable.
1. Open window Field Security.
2. Select the line of the secured field and open Assigned Permission Sets by selecting Actions (Alt+F10), Line, Assigned Permission Sets (or shortcut Ctrl+F7).
3. Click the arrow-down button in the field Permission Set-ID to select the Permission Set for which the security is applicable.
4. Optionally complete the assigned Permission Set with the following fields:
Company: Add a company name to set company specific security. If it is empty the security is for all companies.
Starting Date: Used for starting the assignment of the Permission Set at a specified date.
Ending Date: Used for ending the assignment of the Permission Set at a specified date.
5. If applicable, assign more Permission Sets.
Note: After changes in the Field Security setting a user has to login again to retrieve the appropriate new settings.
Note: Multiple (in)active configurations of field security on the same table for a user account can cancel each other out. The software is designed to work with the combination of all permissions given.
1. Open the window Dataset Security by selecting Departments - Compliance - Security - Field and Dataset Security - Dataset security. The dataset securities per table can be managed in this window (insert, modify or delete).
2. Insert a new dataset security for a table by pressing Ctrl+N.
Note: Only the Security Officer is entitled to create new dataset securities.
3. Complete the following fields in the header:
No.: Automatic and / or manual, based on the number series settings.
Description: E.g. Dataset Security Table 27 Item.
Table ID: Select the ID of the table that you want to split up in datasets. Blocked tables in the Table Categories cannot be used for dataset security. If you select a table that is not categorized you will receive a message. Click Yes if you are sure that you want to setup the dataset security for this table.
4. Data Owner: Select the Permission Set (formerly user role) of the Data Owner of the table:
The Data Owner is entitled to manage the dataset security for the table.
The dataset security settings for the table are not applicable for the Data Owner.
Only the Security Officer is entitled to assign a Data Owner to a dataset security or to create a new dataset security.
5. Start Date: Enter the date from which the dataset security applies (required).
6. End Date: Enter the date until which the dataset security applies (optional).
Note: More than one Secured Dataset may apply to a user. If more than one line is used on visibility, only the first valid line will be applied.
The default checks, defined in Setup module Field and Dataset Security are copied when inserting a new dataset security.
Modifying the checks in the header also modifies the checks in the linked permission sets. Deviating checks in the underlying linked permission sets are marked in the field Deviating Trigger Checks.
In the sub window of the window Dataset Security, you can split up the table into datasets. Datasets are defined with one or two filter fields with which the permissions of users are restricted.
Note: If you are using the visibility option in the dataset security settings, some modifications to the applicable forms / pages are necessary in Dynamics NAV 2015 and older.
1. Add the number of lines equal to the number of datasets the table has to be split up into.
2. Complete the fields per line:
Dataset Type: select whether the dataset is defined for editability or visibility. Visibility is only applicable for customized forms / pages.
Filter Field 1 (and 2): the datasets can be defined by one or two fields of the table. Select the field that is used for filtering.
Filter 1 Type (and 2): select Permission Set (formerly User Role) or User:
Permission Set: the value of Filter 1 (and 2) and the Assigned Permission Sets are used for granting permissions to the dataset.
User: the permission to the dataset is based on user specific filter values that are defined with the User Filter functionality of the module (see Setup User Filters).
Filter 1 Code (and 2): Applicable if Filter Type = User, refers to the User Filter that is used to get the user specific filter value.
Filter 1 (and 2): Applicable if Filter Type = Permission Set, enter the filter value that is applicable for the dataset line.
For option fields, e.g. document type and type in table Purchase Line, you can retrieve the available options by entering 0|1|2|3|4|5|6|7|8|9. The software replaces this with the options from the list if you leave the field. Remove any option you do not require.
The field type is visible when selecting a field.
Filter And Or: Defines whether the value of Filter 1 and / or Filter 2 is applicable for the dataset. If the Dataset Type is Visible, only And is allowed.
No. of Assigned Permission Sets: Applicable if Filter Type = Permission Set. This field shows the number of roles that are linked to the dataset security.
Deviating Trigger Checks: Marked if the checks differ from the checks selected in the header.
Assign, after defining a dataset line with Filter Type = Permission Set, the Permission Set(s) (formerly user roles) for which the dataset is applicable. Dataset lines with Filter Type = User do not use the assigned Permission Sets.
1. Open window Dataset Security.
2. Select the line of the defined dataset.
3. Open Permission Sets per Secured Dataset by selecting Actions (Alt+F10), Line, Assigned Permission Sets (or Ctrl+F7).
4. Click the drop-down icon in the field Permission Set-ID to select the Permission Set for which the dataset is applicable.
5. Optionally complete the assigned Permission Set with the following fields:
Company: Add a company name to set company specific security. If it is empty the security is for all companies.
Starting Date: Used for starting the assignment of the Permission Set at a specified date.
Ending Date: Used for ending the assignment of the Permission Set at a specified date.
6. If applicable, assign more Permission Sets.
See Setup User Filters for more information about the setup and maintenance of user specific filter values.
Note: After changes in the Dataset Security setting a user has to login again to retrieve the appropriate new settings.
This paragraph describes the page customizations that are needed to control invisibility and non-editability. To execute these customizations, you need to have the Application Builder license. We recommend that you add, test and accept the customizations in a test environment.
Note! Filtering for visibility works only when Default Editable is unchecked (false) on the dataset security.
Using dataset security you can filter the lines of a table presented on a page for default NAV tables using eventing. This can be extended to any table desired by adding the pages on which custom visibility is necessary to codeunit 2C Page Event Subscribers (id 11112024). To add a page, a function OnOpen(page name) must be coupled to the codeunit. Configuration and code can be duplicated from the functions for other pages. After any change the codeunit needs to be compiled. Furthermore, the dataset security needs to be set up with dataset type Visible as for regular NAV objects, and the NAV client needs a restart before a change in this configuration takes effect.
Example:
LOCAL [EventSubscriber] OnOpenPageLocationList(VAR Rec : Record Location)
// Local variables: RecRef : RecordRef; FormSecurity : Codeunit "2C Form Security"
BEGIN
  Rec.FILTERGROUP(200);
  RecRef.GETTABLE(Rec);
  FormSecurity.SetFormFilters(RecRef);
  Rec.SETVIEW(RecRef.GETVIEW(FALSE));
  Rec.FILTERGROUP(0);
END;
Field security has the ability to hide values from certain fields for users, i.e. present an empty field instead of the value it actually has. This does not work with eventing, but requires the same modification of objects as for Dynamics NAV 2015 and older.
For versions of NAV without eventing, and (at the moment) for controlling editability of fields in any version, customizations are necessary.
There are two options to customize pages:
Transformation from form to page: for the correct adjustments in the form see Customization for Field and Dataset Security (Forms).
Customize the page manually: follow the instruction below.
1. Determine the page(s) and field(s) that need to be customized. Keep in mind that the number of customized pages should be minimized to avoid conversion issues during upgrade of your software.
Keep in mind that you might need to modify multiple lists and cards which present the same information.
2. Open the relevant Page in the Page Designer (Ctrl+F2).
3. Add a global variable (type Boolean) for each field you want to control:
For example: Vendor No.Editable
<Fieldname>.Editable;
<Fieldname>.Visible;
<Fieldname>.Enabled;
<Fieldname>.HideValue.
4. Set the attribute IncludeInDataset of the above variables to Yes.
5. Set these Booleans to TRUE in the trigger OnInit. For example: "Vendor No.Editable" := TRUE;
6. Assign the new variables to the properties Visible, Enabled, Editable and HideValue of each field whose editability / visibility you want to manage (see the sketch after step 7).
7. Add a new function called “SetFormSecurity”.
Create the Parameter:
“OnOpenForm”
DataType = Boolean
Create the Local Variables:
“RecRef”
DataType = RecordRef
“FormSecurity”
DataType = Codeunit
Subtype = “2C Form Security”
Follow the instructions below if you want to add the possibility to make fields non-editable and not visible with Field Security.
Note that you have to add the code per field.
1. Add the code below to the newly created function SetFormSecurity, based on this example on the field Vendor No. on Page 30 Item for the RTC.
Note! Make sure that the EDITABLE property of a field is always set before the VISIBLE property.
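The original code listing is not reproduced in this document. The sketch below only illustrates the pattern for the field Vendor No. on Page 30 Item; the helper names GetFieldEditable and GetFieldVisible are placeholders (assumptions), so use the actual functions of codeunit 2C Form Security as provided with 2-Controlware. Note that the Editable variable is assigned before the Visible variable, in line with the note above.
SetFormSecurity(OnOpenForm : Boolean)
// Local variables as created in step 7: RecRef : RecordRef; FormSecurity : Codeunit "2C Form Security"
RecRef.GETTABLE(Rec);
// Placeholder helper calls - replace with the actual 2C Form Security functions:
"Vendor No.Editable" :=
  FormSecurity.GetFieldEditable(RecRef,Rec.FIELDNO("Vendor No."),OnOpenForm);
"Vendor No.Visible" :=
  FormSecurity.GetFieldVisible(RecRef,Rec.FIELDNO("Vendor No."),OnOpenForm);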
Visibility for Dataset Security
To add the code to use the visibility option in Dataset Security follow the instruction below:
1. Add at the end of the function “SetFormSecurity” the following code:
1. Add at the end of the trigger OnOpenPage the code:
2. Optional: if existing code in the trigger OnAfterGetRecord adjusts the visible and editable properties of the protected fields or one of the filter fields, add the following code at the end of that trigger (a combined sketch follows after step 5):
Note! Calling the function SetFormSecurity() from this trigger can have negative effects on performance. If this is a concern, the real necessity of calling this function from the trigger should be reconsidered (and possibly removed), or you can opt for a "second", smaller SetFormSecurity() function in which only the relevant fields are checked.
3. Update the section Documentation by adding the following:
4. Compile and test the page.
5. Check the modification after setting up the secured fields.
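The code listings referred to in the steps above are not reproduced in this document. Assuming the visibility code mirrors the eventing example earlier in this chapter, and assuming that TRUE/FALSE on the OnOpenForm parameter distinguishes the call from OnOpenPage from later calls, the additions could look like this sketch:
// At the end of SetFormSecurity (dataset security visibility):
Rec.FILTERGROUP(200);
RecRef.GETTABLE(Rec);
FormSecurity.SetFormFilters(RecRef);
Rec.SETVIEW(RecRef.GETVIEW(FALSE));
Rec.FILTERGROUP(0);

// Trigger OnOpenPage:
SetFormSecurity(TRUE);

// Trigger OnAfterGetRecord (optional, see step 2):
SetFormSecurity(FALSE);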
This paragraph describes the page and form customizations that are needed to control invisibility and non-editability. To execute these customizations, you need to have the Application Builder license. We recommend that you add, test and accept the customizations in a test environment.
1. Determine the form(s) and field(s) that need to be customized. Keep in mind that the number of customized forms should be minimized to avoid conversion issues during upgrade of your software.
2. Open the relevant Form in the Form Designer (Ctrl+F2).
3. Add a new function called SetFormSecurity.
Create the Parameter:
“OnOpenForm”
DataType = Boolean
Create the Local Variables:
“RecRef”
DataType = RecordRef
"FormSecurity"
DataType = Codeunit
Subtype = “2C Form Security”
Follow the instructions below if you want to add the possibility to make fields non-editable and not visible with Field Security.
Note: You have to add the code per field.
1. The example below is applicable to the field Credit Limit in form 21 Customer for the Classic client.
Note! Make sure that the EDITABLE property of a field is always set before the VISIBLE property.
Note! Use the correct transformation rules when transforming a form to a page and make sure that code per field consists of two lines. The second line starts after the :=.
To add the code to use the visibility option in Dataset Security follow the instruction below:
1. Add at the end of the function “SetFormSecurity” the following code:
1. Add at the end of the trigger OnOpenForm the code:
2. Optional: if existing code in the trigger OnAfterGetRecord adjusts the visible and editable properties of the protected fields or one of the filter fields, add the following code at the end of that trigger (a sketch follows after step 5):
Attention! Calling the function SetFormSecurity() from this trigger can have negative effects on performance. If this is a concern, the real necessity of calling this function from the trigger should be reconsidered (and possibly removed), or you can opt for a "second", smaller SetFormSecurity() function in which only the relevant fields are checked.
3. Update the section Documentation by adding the following:
4. Compile the adjusted Form(s) in the Object Designer.
5. Check if the adjustments are functioning correctly.
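The Classic client wiring is analogous to the page version. As a sketch (assuming the same pattern as above), the same FILTERGROUP / SetFormFilters block is added at the end of SetFormSecurity and the triggers call the function:
// Trigger OnOpenForm:
SetFormSecurity(TRUE);

// Trigger OnAfterGetRecord (optional, see step 2):
SetFormSecurity(FALSE);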
In the module Authorization Management Pro authorizations are assigned to employees based on the organization structure. To enter the organization structure in the module Authorization Management Pro you have to define the organization dimensions and the allowed dimension combinations.
After defining the dimensions and the allowed combinations you can enter the organization structure. Go to Departments > Compliance > Security > Authorization management Pro > Organization Structure.
You can use the arrow-buttons on the window to indent or move the lines. Click Edit list to edit the organization structure.
Select a line in the organization structure and open the Dimension Value Card by clicking Dimension Value Card under Navigate.
Here you can enter more details of the organization dimension value.
Authorization Templates are used to flexibly set up user-related tables in Dynamics NAV, including add-ons / customizations, and to define how you want the records to be filled. There are two kinds of authorization templates. A master template is linked to a normal template. The master template can be used to assign the generic fields (for all users), including the key fields of the table. The specific fields for some users can be assigned by using normal templates.
To create an Authorization Template go to Departments > Compliance > Security > Authorization Management Pro. Click on Authorization Templates.
Here you find an overview of the Authorization Templates. To create a new template click New.
Setup the following fields in the header:
Code: Automatic and / or manual, based on the number series settings.
Description: Enter a clear description of the template.
Validate Table Trigger: Uncheck this field if you want to disable the validation checks at input.
Table ID: Select the ID of the table that you want to add records to by using the template.
In the sub window of the window Authorization Template you can select the fields to be filled. You can select the following sources for a value of the field:
Value: Enter a fixed value.
Multiple Values: Enter multiple fixed values by using Actions (Alt+F10), Line, Linked Values or function keys Ctrl+F7.
Formula: Use a date formula.
Employee Authorization Header: Select a field of the employee authorization header.
Organization Dimension Value: Use the organization dimension value of the authorization request line.
No. Series: If the field Key Field of the line is marked, you can choose No. Series to use the number series of the table you have selected in the header.
Key Definition: Select a key definition. To define key definitions see below.
The structure of the master template is the same as above. Note: When setting up a master template, the key of this template is leading. If there are already templates set up for the same table, you have to remove the key fields from these templates first.
You can define key definitions based on (parts of) fields of the authorization request header. To define the key definition go to Departments > Compliance > Security > Authorization Management Pro. Click Key Definitions.
Organization roles are based on the organization structure, and to set up this structure, dimensions need to be defined.
To create an Organization Dimension go to Departments > Compliance > Security > Authorization Management Pro. Click on Organization Dimension under Organization Structure.
Here you can enter the Organization Dimensions. Choose the classification to describe your organization, for example departments and functions.
Open the window Organization Dim. Combinations by selecting menu Departments > Compliance > Security > Authorization Management Pro to enter the allowed child of a parent.
After defining the dimensions and the allowed combinations you can enter the organization structure.
To use authorization requests:
1. Open the window Authorization Request by selecting Departments - Compliance - Security - Authorization Management Pro - Authorization Requests. In this window you can manage authorization requests for users.
2. Complete the following fields in the header.
User ID: Select the user (if available).
Name: Enter the full name of the employee. If you have selected a user ID this field is filled automatically.
Employee No.: Optional: create an employee (by using Actions, Employee) if you want to use this functionality.
3. Under the heading Authorization Request Subform, fill in the line(s) with, among other things, the Dimension Value that must be assigned to the user.
4. Click on 'Process Request' at the top.
Navigate to Departments - Compliance - Security - Authorization Management Pro- Processed Authorization Requests to view the History Of Processed Authorization Requests.
In this overview you can inspect each Processed Authorization Request. To view a specific request, you can use the standard filter function.
This screen contains the following columns:
No.: Number of the Authorization Request
Type: The type of the Authorization Request. This can be New Employee, Mutation, Leaving Employee or Job Replacement.
Name: The name of the user within the Authorization Request
Employee No.: The Employee number of the user within the Authorization Request
User ID: The User ID of the user within the Authorization Request
Status: Shows the status of the Authorization Request.
Last Modified By: The user who last modified the Authorization Request.
Last Date Modified: The date on which the Authorization Request was last modified.
Company (Group): This is an optional field. It shows the company (group) within the Authorization Request.
To configure default settings of the module Authorization Management Pro:
1. Open the window Authorization Management Pro Setup by selecting menu Departments - Compliance - Security - Authorization Management Pro - Setup - Authorization Management Pro Setup.
2. The FastTab General contains the following fields:
Authorization Dimension: The dimension on which authorizations are granted. A line for this dimension is added to the request by default.
Dimension 1 Code: Used on the header of Organization Dimension Value.
Dimension 2 Code: Used on the header of Organization Dimension Value.
Process With Job Scheduler: Mark this field to automatically process non-processed authorization requests and / or employee authorizations. For an explanation of the Job Scheduler see Work with Job Scheduler.
Automatic Process Auth. Req.: Mark this field to automatically process non-processed authorization requests. Only requests with status Released, and if approval is used status Approved, will be processed.
Last Update: Shows the date and time of the last update.
Use Approval: Check this field to use the approval step in processing authorization requests.
Note that from Dynamics NAV 2016 and up the default workflow functionality is used. Approval of authorization requests requires setup in the Dynamics NAV workflow configuration. The fields User Approval and Approval Officer are not present in the Authorization Management Pro setup in those versions.
Approval Officer: Select the role with permissions to approve requests in the field Approval Officer. Only users linked to this role are allowed to approve requests in the Authorization Request window.
Employee No. Key Definition: Select the appropriate key definition to generate new employee numbers.
No. of Bold Levels Org. Structure: this field contains the number of levels to be shown bold in the organization structure.
3. Assign on the FastTab Numbering the number ranges used for:
Authorizations Request Nos.
Employee Authorization Nos.
Authorization Template Nos.
Master Template Nos.
You can use these number ranges to assign numbers automatically.
The window Employee Authorization shows on FastTab Integration employee-related fields. Besides these fields (of standard Dynamics NAV tables) you can define 3 fields to be shown on the employee authorizations. Select on FastTab Flexible Key Fields the tables and fields to be used.
FastTab Synchronization shows the date and time of the previous synchronization of the standard Dynamics NAV user tables.
In the module Authorization Management Pro authorizations are assigned to employees based on the organization structure. To enter the organization structure in the module Authorization Management Pro you have to define the following:
1. Open the window Organization Dimensions by selecting menu Departments - Compliance - Security - Authorization Management Pro - Organization Structure - Organization Dimensions. Here you can enter the Organization Dimensions. Choose the classification to describe your organization, for example departments and functions.
2. Open the window Organization Dim. Combinations by selecting menu Departments - Compliance - Security - Authorization Management Pro - Organization Structure - Organization Dimensions Combinations to enter the allowed child of a parent.
After defining the dimensions and the allowed combinations you can enter the organization structure.
Open the window Organization Structure by selecting menu Departments - Compliance - Security - Authorization Management Pro - Organization Structure - Organization Structure. Use the arrow-buttons on the window to indent or move the lines.
Authorizations are granted based on selecting the appropriate organization dimension value (e.g. function) for the employee. You can assign authorization actions to organization dimension values.
1. Open the window Organization Dim. Value by selecting menu Departments - Compliance - Security - Authorization Management Pro - Organization Structure - Organization Structure.
2. Select the line and open the card by (Related Information, Dimension, Dimension Value Card).
3. In the sub window of the window Organization Dim. Value you can select the actions to be performed.
4. Select the type of action. There are 3 types of actions:
Template: authorization template or master authorization template (as defined in Authorization Actions).
Batch: batch action (see Authorization Actions).
Organizational Role (formerly user profiles): grouped permission sets (formerly user roles) as defined in the 2-Controlware module Authorization Management (see Organizational Roles).
To enter a comment-line for readability leave the type blank.
5. You can define the action at the following moments:
Execute for Starting Date
Execute for Ending Date
Modify Record(s): check this field to modify the existing record at the ending date. Otherwise the record will be deleted.
Execute for Replacement
Modify Replacement Record(s): check this field to modify the existing record at the ending date of the replacement. Otherwise the record will be deleted.
6. In the header of the organization dimension value you can optionally select a default company or company group to limit the execution of the templates and profiles to those companies. The company(group) can always be changed at the authorization request lines.
7. The option Copy Menu From User ID allows you to copy the Navigation Pane of a certain key user in the Classic client.
Besides the authorization actions based on the organization dimension value there are often actions to be performed for every authorization request. These are called pre and post actions. Pre actions are executed before the request, post actions are executed after the request.
Open the window Pre and Post Actions by selecting Departments - Compliance - Security - Authorization Management Pro - Authorization Actions - Pre and Post Actions.
In addition to the fields as described in Authorization Actions per Organization Dimension Value you have to indicate per line the running order (pre or post).
Batch actions are used to execute reports or codeunits. You can use these to perform functions of add-ons. You can link the employee authorization header or line to the report or codeunit.
In this chapter you can read about creating, processing and reviewing Authorization Requests.
In the window Authorization Request of the module Authorization Management Pro you can create and manage authorization requests.
You define in the authorization request for which place in the organization structure the new user should be authorized. At processing, all the linked actions are executed automatically.
After the initial setup, authorizing new users requires almost no manual input.
The module Authorization Management Pro offers approval functionality for e.g. managers. If you have enabled Approval in setup (see Authorization Management Pro Setup) the flow is as follows: Open > Released > Approved > Processed. Otherwise released requests can be processed directly.
Open the window Authorization Request by selecting Departments - Compliance - Security - Authorization Management Pro - Authorization Requests.
Use the Functions to proceed to a next status.
For automatic processing of authorization requests see Authorization Management Pro Setup.
All processed authorization requests are archived and can be viewed.
Open the window Processed Authorization Request by selecting Departments - Compliance - Security - Authorization Management Pro - Processed Authorization Requests.
The window Employee Authorization shows the current authorizations of employees.
Open the window Employee Authorization by selecting Departments - Compliance - Security - Authorization Management Pro - Employee Authorizations.
To update the authorizations after changing authorization actions in the window Organization Dimension Value, select Actions, Functions, Update Employee Authorizations in the window Organization Dim. Value.
In this chapter you can read about (Master) Authorization Templates and Key definitions.
Authorization Templates are used to flexibly set up user-related tables in Dynamics NAV, including add-ons / customizations, and to define how you want the records to be filled. There are two kinds of authorization templates. A master template is linked to a normal template. The master template can be used to assign the generic fields (for all users), including the key fields of the table. The specific fields for some users can then be assigned by using normal templates.
1. Open the window Authorization Template by selecting menu Departments - Compliance - Security - Authorization Management Pro - Authorization Actions - Authorization Templates.
2. Setup the following fields in the header:
Code: Automatic and / or manual, based on the number series settings.
Description: Enter a clear description of the template.
Validate Table Trigger: Uncheck this field if you want to disable the validation checks at input.
Table ID: Select the ID of the table that you want to add records to by using the template.
3. In the sub window of the window Authorization Template you can select the fields to be filled. You can select the following sources for a value of the field:
Value: Enter a fixed value.
Multiple Values: Enter multiple fixed values by using Actions (Alt+F10), Line, Linked Values or function keys Ctrl+F7.
Formula: Use a date formula.
Employee Authorization Header: Select a field of the employee authorization header.
Organization Dimension Value: Use the organization dimension value of the authorization request line.
No. Series: If the field Key Field of the line is marked, you can choose No. Series to use the number series of the table you have selected in the header.
Key Definition: Select a key definition. To define key definitions see below.
The structure of the master template is the same as above.
Note! When setting up a master template, the key of this template is leading. If there are already templates set up for the same table, you have to remove the key fields from these templates.
You can define key definitions based on (parts of) fields of the authorization request header. To define the key definition open the window Key Definition by selecting menu Departments - Compliance - Security - Authorization Management Pro - Authorization Actions - Key Definitions.
Standard Dynamics NAV does not offer functionality to ensure users fill in the data completely (mandatory fields). With our Mandatory Fields module, you can set up mandatory fields per table. When users modify a record in the table the module checks if all fields are filled correctly, ensuring the quality of your data is of the standard you desire.
You can read the following about Mandatory Fields in our documentation:
Setup Mandatory Fields: Basic settings for the use of Mandatory Fields.
Work with Mandatory Field: Setting up a Mandatory Field.
How to Setup a Mandatory Field: Walkthrough.
How to Check Mandatory Fields preventive: Walkthrough.
Subsets of table
By using a filter you can make the mandatory field definition only applicable for a subset of records within the table.
Error message or warning
Mandatory Fields can be setup in different varieties. You can set up mandatory fields as a warning, which means the user receives a warning if not all fields are filled, but the user can still save the record. The other variant is the error message, with which the user receives an error message and cannot save the record until all mandatory fields are filled correctly.
Conditional checks
You can also add a condition to the check moment. This variant allows you to modify the validation field to a certain value. Conditional checks can be combined with error messages or warnings.
Analyze the quality of your data
The functionality mentioned above concerns preventive checks. However, organizations also want to be able to check their existing data. The Mandatory Fields module also offers detective functionality, with which you can easily check if the mandatory fields you have set up are correctly filled on all current records.
To make use of Field & Dataset Security you need the following permissions.
In the window Mandatory Field Setup you define the setup to work correctly with the module Mandatory Fields.
Open the window Mandatory Field Setup.
Enter on the FastTab General the Permission Set (formerly user role) that represents the “Security Officer” of the company. The Security Officer is authorized to create and link data owners to new mandatory field per table.
Assign on the FastTab Numbering the number ranges used for mandatory fields. You can use these number ranges to assign numbers to the mandatory fields cards.
To avoid performance issues, the module Mandatory Fields should not be used for all tables. In the Table Categories all standard tables of Dynamics NAV are categorized and if applicable blocked for mandatory fields. An import file with the default Dynamics NAV tables is provided by your Dynamics NAV partner. The import file also defines some of the linked forms to be used in the window Mandatory Fields Check Overview.
This procedure explains how you can import and define table categories.
Follow the steps below to define table categories:
Open the window Table Categories by selecting Departments - Compliance - Security - Field and Dataset Security - Setup - Table Categories.
Import the table categories by selecting Actions, Functions, Fill Table Categories.
Press Ctrl+N to create a table category.
Select in the field Table ID the table you want to add. The field Table Name is filled automatically.
Check the field Blocked if this table should not be used in field and dataset securities or for mandatory fields.
Enter in the field Module the Dynamics NAV module to which the table applies.
Select in the field Table Type the table type that applies to the table.
Use Related Information, Line, Linked Forms to modify the linked forms to be used in the Mandatory Fields Check Overview. Select 0 in Filter Field No. to define a form for all fields of a table, or use a filter field and filter value.
Note! This table is also used for the module Field and Dataset Security. The option Line, Linked Forms is not applicable for the module Field and Dataset Security.
The Mandatory Fields module also offers detective functionality, with which you can easily check if the mandatory fields you have set up are correctly filled on all current records (existing data).
1. Open the window Mandatory Fields and select Actions, Functions, Check Table. This function calculates the window Mandatory Fields Check Overview (with a mandatory fields filter).
The calculation can also be executed directly from the window Mandatory Fields Check Overview (no filter).
Notice that the field Calculated is checked. It shows whether the information in the window is up to date or not. To navigate in the overview (expand/collapse) right-click on the indentation mark.
Standard Dynamics NAV does not offer functionality to ensure users fill in the data completely (mandatory fields). With our Mandatory Fields module, you can set up mandatory fields per table. When users modify a record in the table the module checks if all fields are filled correctly, ensuring the quality of your data is of the standard you desire.
This procedure describes how mandatory fields per table can be managed with the module Mandatory Fields.
Follow the steps below to create a new mandatory field card:
1. Open the window Mandatory Fields. The mandatory fields per table can be managed in this window (insert, modify or delete).
2. Insert a mandatory field card for a table by pressing Ctrl+N. Note! Only the Security Officer is entitled to create a card.
3. Complete the following fields in the header.
No.: Automatic and / or manual, based on the number series settings.
Description: Enter a reason for choosing the fields.
Table ID: Select the ID of the table that you want to monitor. Blocked tables in the Table Categories cannot be used for mandatory fields. If you select a table that is not categorized you will receive a message.
Click Yes if you are sure that you want to setup mandatory fields for this table.
Filter Field No. and Filter Field Value: You can make the mandatory field definition applicable to a subset of the table. Select the field to filter the records the mandatory field definition applies to and enter the filter value. Enter field number 0 to not use the filter option.
Validation Type: Select one of the following validation types:
Error: Display an error message and roll back the changes.
Warning: Display a warning message, change the value of the validation field to the indicated value (and store the changes).
Detective: Analyze which records do not comply with the defined specifications by using Actions, Functions, Check Table.
No Message: Display no message if the field contents are modified by the module. This way, values can be modified automatically without interrupting the user.
4. Data Owner: Select the Permission Set (formerly user role) of the Data Owner of the table. The Data Owner is entitled to manage the mandatory fields for the table. The mandatory fields for the table are not applicable for the Data Owner. Only the Security Officer is entitled to assign a Data Owner to a mandatory field card or to create one.
5. Company: If this field is blank, the mandatory fields definition applies to all companies. If a company is selected the definition only applies to that company.
6. Valid for Permission Set ID: If this field is blank, the definition applies to all users. If a permission set is selected the definition only applies to users linked to that permission set.
7. Start Date: Enter the date from which the mandatory fields apply (required).
8. End Date: Enter the date until which the mandatory fields apply (optional).
Select Checks
On FastTab Triggers you can define the moment to check for mandatory fields: at insertion and / or modification of a record.
In the sub window of the window Mandatory Fields, fields of the table can be selected to be mandatory fields.
Add a line per field and add the conditions to check on. Optionally enter a description for the line.
The following checks are available
Not Blank
Minimum and Maximum Length
Minimum and Maximum Value
Equal To
Not Equal To
The conditions depend on the field type. The check Not Blank is available for all field types. The other conditions are available for:
Field Type | Length | Value | (Not) Equal To |
Code | x | x | |
Text | x | x | |
Integer | | x | |
Decimal | | x | |
Note! The check (Not) Equal To is treated as a filter.
1. Open the window Mandatory Fields and open the list Mandatory Fields. Click on New to create a new Mandatory Field.
Complete the necessary General fields. In this example we are creating a mandatory field definition for the vendor card. The validation type returns an error and rolls back the complete card.
Note! Don't forget the field "Valid for Permission Set ID". If you leave this blank, the definition applies to all users; if you fill it, it only applies to users with the specific Permission Set.
After the General information, we need to define the fields. In this example it is required to fill the following fields: Name, Address, City, Phone No. and VAT Registration No. with 11 characters.
After defining the lines, we have to fill the last two FastTabs. In this example we are not using the Conditional tab but only the Triggers tab. The Triggers tab defines the moment to check the mandatory fields.
You can add a condition to the check moment. This variant allows you to save the changes and modify the validation field to a certain value. Conditional checks can be combined with error messages (rollback of modifications) or warnings.
The following scenarios may occur:
Preventive (on input) AND Detective (analyze)
Nr. | Validation Type | Conditional / Modify Validation Field | Rollback |
1. | Error | None | Yes |
2. | Error | Conditional | Yes |
3. | Warning | None | No |
4. | Warning | Conditional | No |
5. | Warning | Conditional + Modify | No |
Detective (analyze)
Nr. | Validation Type | Conditional / Modify Validation Field | Rollback |
6. | Detective | None | - |
7. | Detective | Conditional | - |
If the objective is to control the input, scenarios 1, 2 and 5 are the most logical setup. Scenarios 3 and 4 only show a warning (optionally conditional), but do not stop the user.
Example of scenario 5: If a Customer card is blocked for All (transactions), you allow the user to save an incomplete record. But when the field Blocked is modified to unblocked, you want the check on mandatory fields to be executed. If the field Blocked is modified to unblocked and not all mandatory fields are filled correctly, the field Blocked is modified back to All (transactions). In this way, a Customer cannot be unblocked before all fields are correctly filled.
Note! If a validation field is modified by a function in Dynamics NAV (e.g. status fields), only validation type error will work. A warning with saving modifications is not possible, a full rollback will occur.
Cost Allocation (formerly Permanence) is a powerful financial solution that is fully integrated within Dynamics NAV. With Cost Allocation, costs can be automatically processed over multiple periods by using the recurring journal. Examples of such costs are annual and quarterly subscriptions, software licensing and hardware maintenance costs, costs of cleaning and maintenance of buildings, subscriptions to professional literature, insurance, energy, etc. In other words, all costs over multiple periods can be distributed in a flexible manner.
Open window Cost Allocation Setup by selecting Departments - Compliance - Control - Cost Allocation - Setup - Cost Allocation Setup.
Fill these fields
1. FastTab General:
Interim Account No.: This field contains the number of the G/L interim account you want to use to post the invoice and the corresponding Cost Allocations.
Recurring Frequency Period: Select ‘Week, Month, Bimonthly, Quarter, Year or Period’.
Number Of Recurring Periods: Enter over how many periods you want to distribute the costs.
Posting Day Within Period: Select ‘First Day, Last Day or Specific Day’. The distributed costs will be posted on the chosen day in the chosen frequency.
Day Within Period: If Posting Day Within Period is ‘Specific Day’ you can enter the day in the period.
Amount Calculation Rule: Select how to divide the cost over the periods: ‘Proportionally, Daily Basis or Manual’.
No. Of Journal Lines: Select ‘One Journal Line’ to use the field Balance Account No. for the contra entry or ‘Two Journal Lines’ to show the contra entry as separate lines.
Use Allocation: Check this field to allocate costs within a period by using Allocation Lines.
Create Cost Allocation from Invoice: Cost Allocations can be created from a template, manually, or the user may choose each time an allocation is created.
No. of Bold Levels Dashboard: The number of levels presented in bold typeface on the dashboard.
2. Assign on the FastTab Numbering the number ranges used for Cost Allocation templates and Cost Allocations.
3. FastTab General Journal: Check the field Create General Journal to automatically create a general journal line. If the field is checked the following fields are relevant:
Post General Journal Line: Select ‘Post or Post and Print’ if you want to post the created General Journal Line automatically when processing the Cost Allocation.
Posting Date: Select the posting date of the General Journal Line: ‘Document Posting Date or Work Date’.
Template Name: Select the general journal template you want to use for Cost Allocations. We advise you to create a separate template for Cost Allocations.
Batch Name: Select the batch you want to use.
Document No.: Select Document if you want the field Document No. in the general journal to contain the document number of the invoice. Select Journal Batch to use the number of the batch.
Description: Select Document Line if you want the field Description in the general journal to contain the description of the corresponding document line of the invoice. Select Template to enter the description in the field Description Template.
4. FastTab Recurring Journal:
Post Recurring Journal: Select ‘Post or Post and Print’ if you want to post the Recurring Journal Line(s) valid for the current work date.
Reference Date: This field determines which date to use to calculate the starting date of the Cost Allocation.
Template Name: Select the recurring journal template you want to use for Cost Allocations. We advise you to create a separate template for Cost Allocations.
Recurring Journal Batch: Select the batch you want to use.
Document No.: Select ‘Document’ if you want the field Document No. in the recurring journal to contain the document number of the invoice. Select ‘Journal Batch’ to use the number of the batch.
Description: Select ‘Document Line’ if you want the field Description in the general journal to contain the description of the corresponding document line of the invoice. Select ‘Template’ to enter the description in the field Description Template.
Automatically clean Rec. Jnl.: Check this field to automatically clean up expired recurring journal lines.
5. FastTab Posting Groups: Here you can enter default values for the following fields if these need to be different from the default general ledger configuration:
Gen. Posting Type
Gen. Bus. Posting Group
Gen. Prod. Posting Group
VAT Bus. Posting Group
VAT Prod. Posting Group
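As an illustration of the Amount Calculation Rule, a minimal sketch of the ‘Proportionally’ rule is shown below. This is not 2-Controlware source code; the function name and parameters are invented for the example, and the rounding difference is shown as landing in the last period (in the actual module an additional line may be created in the last period to solve rounding issues).

```
// Illustration only - not 2-Controlware code. 'Proportionally' divides the
// invoice amount evenly over the periods; the rounding difference ends up in
// the last period so that all lines add up to the invoice amount.
PROCEDURE CalcProportionalPeriodAmount(TotalAmount : Decimal;NoOfPeriods : Integer;PeriodNo : Integer) : Decimal;
VAR
  PeriodAmount : Decimal;
BEGIN
  PeriodAmount := ROUND(TotalAmount / NoOfPeriods,0.01);
  IF PeriodNo < NoOfPeriods THEN
    EXIT(PeriodAmount);                                  // e.g. 1,000.00 over 3 periods -> 333.33
  EXIT(TotalAmount - PeriodAmount * (NoOfPeriods - 1));  // last period: 1,000.00 - 2 x 333.33 = 333.34
END;
```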
A Cost Allocation template can be used to capture the settings for similar Cost Allocations. The initial settings when creating a template are retrieved from the Cost Allocation setup. You can change the values of the fields per template if necessary. For an explanation of the fields, see the previous paragraph (Cost Allocation Setup).
Follow the steps below to create a Cost Allocation Template
1. Open window Cost Allocation Template by selecting Departments - Compliance - Control - Cost Allocation - Cost Allocation Template and select New (Ctrl+N).
2. On FastTab General:
Name: Enter a suitable short name for the Cost Allocation template. Tip! Consider allowing manual numbers in the chosen number series so you can use recognizable names instead of numbers.
Description: Enter a description for the template.
3. On FastTab Dimensions: Here you can optionally define the dimension values to be used for analysis purposes for the Cost Allocation. When dimensions are modified, the changes can be applied to the dimensions in the allocation lines.
Some tips:
If the field Use Allocation is checked you can enter Allocation Lines by selecting Actions (Alt+F10), Line, Allocation Lines. Here you can also enter dimensions per allocation line by selecting Actions (Alt+F10), Line, Dimensions.
You can optionally define the dimension values to be used for analysis purposes for the Cost Allocation template.
There are two ways to create a Cost Allocation:
Create Cost Allocation manually: see Create Cost Allocation Proposal.
Create Cost Allocation directly from a Purchase Invoice or Credit Memo. To be able to create Cost Allocations directly from purchase invoices or credit memos, your Microsoft NAV Partner needs to customize some lines in some objects. See Customize NAV for 2-Controlware Cost Allocation. After customization you can select an invoice line and select Actions (Alt+F10), Line, Cost Allocation to create a Cost Allocation.
When creating a Cost Allocation you select the appropriate template to capture the settings. You can change the values of the fields for the Cost Allocation if necessary. For an explanation of the fields, see the previous paragraphs (Cost Allocation Setup and Create Cost Allocation Template).
Some tips:
Based on the settings in the header of the Cost Allocation proposal, the Cost Allocation lines are calculated automatically, but you can still change the lines if necessary. To solve rounding issues an additional line may be created in the last period of the Cost Allocation.
You can optionally define the dimension values to be used for analysis purposes for the Cost Allocation. When dimensions are modified, the changes can be applied to the dimensions in the allocation lines.
If the field Use Allocation is checked you can enter Allocation Lines by selecting Related Information, Line, Allocation Lines. Here you can also enter dimensions per allocation line by selecting Related Information, Line, Dimensions.
The following fields are additional compared to a Cost Allocation template:
1. Open window Cost Allocation by selecting Departments - Compliance - Control - Cost Allocation - Cost Allocation and select New (Ctrl+N).
2. FastTab General:
No.: Automatic and / or manual, based on the number series settings.
Description: Enter a description for the Cost Allocation.
Document Type: Select Purchase Invoice or Purchase Credit Memo.
Document No.: Select the appropriate purchase invoice line.
Pre-Assigned No.: This field is automatically filled if the document is not posted.
Post Recurring Journal
1. Select Actions, Functions, Post Recurring Journal.
2. The field Posted of a Cost Allocation line is checked if a line is posted.
Clean Recurring Journal
If the field Automatically clean Rec. Jnl. in the header of a Cost Allocation is not checked, you can manually clean up the expired Cost Allocations in the recurring journal.
1. Select Actions, Functions, Clean Recurring Journal.
Batch Job
You can use the function Post Batch to post all open Cost Allocation lines of all processed Cost Allocations.
1. Select Actions, Functions, Post Batch.
Set Work Date
When working with the module Cost Allocation, it is important that the work date is set correctly. When posting recurring journal lines, all lines with an expiration date before the work date are posted. Lines that correct rounding differences are only posted if the work date is equal to the expiration date of the line.
Process Cost Allocations
1. Open window Cost Allocation by selecting Departments - Compliance - Control - Cost Allocation - Cost Allocation.
2. Select Actions, Functions, Process Cost Allocation.
The Status of the Cost Allocation changes to ‘Processed’.
If a Cost Allocation line is posted the field Posted is checked.
To be able to create Cost Allocations from purchase invoices or credit memos, your Microsoft NAV Partner needs to customize some lines in some objects. We recommend testing the customizations in a test environment.
Note: If you also want to post automatic Cost Allocations with batch posting, you have to add the code below to the batch post reports of NAV in the OnAfterGetRecord() function (for instance in report 497 Batch Post Purchase Invoices).
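The exact lines are delivered by 2-Control together with the objects and are not reproduced here. Purely as an illustration of where the call is placed, a customization of report 497 could have the following shape; the codeunit variable and function name are placeholders, not the actual 2-Controlware identifiers.

```
// Report 497 Batch Post Purchase Invoices, dataitem Purchase Header, trigger OnAfterGetRecord().
// Illustration only - the real code is provided by 2-Control.
// CostAllocationMgt is a placeholder codeunit variable for the 2-Controlware Cost Allocation logic.
CostAllocationMgt.CreateCostAllocationFromDocument("Purchase Header");
```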
In standard Dynamics NAV, monitoring the reconciliation of the inventory and the work in progress is very difficult. Once you experience differences, it is almost impossible to get insight into the reconciliation.
The module Inventory Reconciliation provides the ability to monitor the inventory and work in progress, to reconcile them with the general ledger and to analyze differences. The analysis can be done in user-friendly and intuitive windows. Besides the analysis of the inventory accounts, the module can also analyze and reconcile the applied accounts (result).
In the window Reconciliation Setup you can define the setup for the module Inventory Reconciliation. Follow the steps below to set up the module Inventory Reconciliation:
Open the window Reconciliation Setup by selecting Departments - Compliance - Control - Reconciliation - Inventory Reconciliation - Setup - Reconciliation Setup.
Enter in the field Inventory Post Code the source code that is used for inventory postings: select the same code as selected for Inventory Post Cost in menu Departments - Financial Management - Setup - Trail Codes - Source Code Setup (most common: English: “INVTPCOST” / Dutch: “VOORWDEBKN”).
Select Actions, Functions, Setup Default Reasons Skipped Value Entry and Setup Default Reasons Skipped G/L Entry. In the FastTabs Skipped Value Entry and Skipped G/L Entry, Dynamics NAV automatically fills the default reason codes that are used for skipping value and G/L entries.
Only applicable for Dynamics NAV version 3.x:
Set up parameters for posting of expected costs to G/L.
Open the window Expected Cost Posting to G/L by selecting Compliance - Setup - Expected Cost Posting to G/L.
Enter to which value entry range the expected costs were posted to the G/L (usually applicable for all entries from number 1).
Open the window Reconciliation Setup with Departments - Compliance - Control - Reconciliation - Inventory Reconciliation - Setup - Reconciliation Setup. Enter in the field Inventory Post Code the source code (usually “VOORWDEBKN”). Select Actions, Functions, Setup Default Reasons Skipped Value Entry and Setup Default Reasons Skipped G/L Entry. In the FastTabs Skipped Value Entry and Skipped G/L Entry, Dynamics NAV automatically fills the default reason codes that are used for skipping value and G/L entries.
Only applicable for Dynamics NAV version 3.x: Set up parameters for posting of expected costs to G/L. Open the window Post Expected Costs to G/L. Enter to which value entry range the expected costs were posted to the G/L (usually applicable to all entries from number 1).
Create a new reconciliation view. Enter in the field Code a clear code and in the field Name a clear description. Select in the field Type ‘Inventory Reconciliation’ and select in the field Date Compression the desired compression level (a higher compression level gives a quicker build). Select Determine Used Accounts to be able to analyse the historical reconciliation in addition to the reconciliation based on the current posting group setup. Select Show Progress if you want to see the progress during the calculation (in larger environments we advise deselecting Show Progress for better performance).
You can also specify the reconciliation view with Dimensions and Customer, Vendor and Item Attributes. The Customer, Vendor and Item Attributes cannot be used in the analysis windows.
Update the reconciliation view initially with Related Information, Update, All. The initial building can take a while depending on the size of your database.
If there are notifications about non-existing accounting periods while you are updating the reconciliation view, you need to create these periods. Afterwards, you can resume updating the reconciliation view with Related Information, Update, From Last Entry. The calculation continues from the last entry processed before the notification.
If value entries were skipped during the update of the reconciliation view, you have to analyse and process these skipped entries (if possible). You can do this in the window Reconciliation View via the field No. of Skipped Value Entries. Select the arrow down button to show a list of the skipped value entries. Here you can analyse and process the value entries on the Skipped Value Entry Card.
If the reconciliation is built completely, you can calculate the analysis in the windows. This is done with the button Related Information, Update, Analysis. You can also set filters for the calculation. After the calculation you can open the windows Analysis Inventory Accounts or Analysis Applied Accounts. If you select the reconciliation view in the field Reconciliation View Code, the window will fill with data. You can also recalculate the windows Analysis Inventory Accounts and Analysis Applied Accounts with Actions, Functions, Calculate, for example after changing the filters.
With the data in the analysis windows you can analyse the reconciliation between the general ledger and the sub administration of the inventory and applied accounts. Possible differences can be narrowed down to a certain period or posting group combination via Actions (Alt+F10), Account, Analysis Per Period and Actions (Alt+F10), Account, Analysis Per Posting Group.
From the main window and the analysis per posting group window you can zoom in to the entries to trace and analyse possible differences to document numbers.
After updating the reconciliation view, you need to calculate the analysis windows to analyse the most recent data. You can do this by executing Related Information, Update, Analysis in the window Reconciliation View. Based on this function the windows Analysis Inventory Accounts and Analysis Applied Accounts are updated. You can also add filters for the calculation of the analysis. The function to calculate analyses can also be executed from the windows Analysis Inventory Accounts and Analysis Applied Accounts.
Note! Check whether the analysis has been updated, otherwise update it.
In the window Analysis Applied Accounts you can analyse the reconciliation between the applied accounts and the general ledger accounts based on the reconciliation view.
Follow the steps below to analyse the Applied Accounts:
1. Open the window Analysis Applied Accounts.
2. Select in the field Reconciliation View Code the code that you want to use for analysis.
3. Select in the field Show Fields the type of reconciliation that you want to view. You can analyse based on the historical reconciliation (Used) or based on the current settings (Setup).
4. The window shows per general ledger account the reconciliation between the Applied accounts (the fields (1) Applied Inventory Amount, (2) Applied Sales Amount, (3) Production Variances and (4) Applied Purchase Amount) and the general ledger amount (field G/L Amount). In the field Difference you can see the differences. On FastTab Filters or with the time interval buttons on the bottom of the window you can refine the reconciliation.
5. If the window Analysis Applied Accounts shows differences, the first (quickly visible) possible cause is that the account is also used as an inventory account. You can see this in the field Used Inventory Account or Setup Inventory Account (depending on which reconciliation (Show Fields) you are analysing). There are two other possible causes of differences, manual postings or other automatic postings, visible in the following fields:
G/L Manual Entry Amount: the amount of manual postings on the G/L account.
G/L Other Non-manual Amount: automatic postings generated by the system for which the system cannot find a linked value entry in the sub administration, e.g. compressed entries or entries from another sub administration.
If the amount of the manual postings and the amount of the non-manual postings are equal to the difference, the sub administration and the general ledger are reconciled completely. In other words: “Difference - G/L Manual Entry Amount - G/L Other Non-manual Amount” should always be 0.
6. You can filter in various ways with the FastTabs Filters and Posting Group Filters. Note that when a filter is changed, the analysis should be recalculated. Click on Actions, Functions, Calculate.
7. You can use the field Show Check Columns to make the fields Inventory/WIP Sub Ledger Amount and Check visible. With these fields you can check the reconciliation. The field Inventory/WIP Sub Ledger Amount shows the possible posted inventory related entries on the general ledger accounts. The field Check calculates the difference minus the fields G/L Manual Entry Amount, G/L Other Non-manual Entry Amount and Inventory/WIP Sub Ledger Amount (see the sketch after these steps). If the column Check still shows differences, you need to contact your Dynamics NAV partner.
8. You can trace possible differences to a certain period or posting group by analysing the reconciliation of only one G/L account per period or posting group with the windows Analysis by Period, Analysis Production variance by Posting group and Analysis Applied Account by Posting Group. Select Actions (Alt+F10), Account and select Analysis by Period, Analysis Production variance by Posting group or Analysis Applied Account by Posting Group to open the specific windows.
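The calculation behind the Check column in step 7 can be summarised as follows (field captions as shown in the window; this is a description of the calculation, not actual 2-Controlware source code):

```
// A fully reconciled account shows Check = 0.
Check := Difference
         - "G/L Manual Entry Amount"
         - "G/L Other Non-manual Entry Amount"
         - "Inventory/WIP Sub Ledger Amount";
```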
In the window Analysis Inventory Accounts you can analyze the reconciliation between the inventory accounts and the general ledger accounts based on the reconciliation view.
Follow the steps below to analyze the inventory:
1. Open the window Analysis Inventory Accounts.
2. Select in the field Reconciliation View Code the code that you want to use for analysis.
3. Select in the field Show Fields the type of reconciliation that you want to view. You can analyze based on the historical reconciliation (Used) or based on the current settings (Setup).
4. The window shows per general ledger account the reconciliation between the sub administration (the fields Inventory, Inventory (Interim) and WIP Inventory) and the general ledger (field G/L Amount). In the field Difference you can see the differences. On FastTab Filters or with the time interval buttons on the bottom of the window you can refine the reconciliation.
5. If the window Analysis Inventory Accounts shows differences, the first (quickly visible) possible cause is that the account is also used as an applied account. You can see this in the field Used Applied Account or Setup Applied Account (depending on which reconciliation you are showing). There are two other possible causes of differences, manual postings or other automatic postings, visible in the following fields:
G/L Manual Entry Amount: the amount of manual postings on the G/L account.
G/L Other Non-manual Amount: automatic postings generated by the system for which the system cannot find a linked value entry in the sub administration, e.g. compressed entries or postings from another sub administration.
If the amount of the manual postings and the amount of the non-manual postings are equal to the difference, the sub administration and the general ledger are reconciled completely. In other words, “Difference - G/L Manual Entry Amount - G/L Other Non-manual Amount” should always be 0.
6. You can filter the differences in various ways with the FastTabs Filters and Posting Group Filters. Note that when a filter is changed, the analysis should be recalculated. Click on Actions, Functions, Calculate.
7. You can use the field Show Check Columns to make the fields Applied Sub Ledger Amount and Check visible. With these fields you can check the reconciliation. The field Applied Sub Ledger Amount shows the possible entries of applied costs on the general ledger accounts. The field Check calculates the difference minus the fields G/L Manual Entry Amount, G/L Other Non-manual Entry Amount and Applied Sub Ledger Amount. If the column Check still shows differences, you need to contact your Dynamics NAV partner.
8. You can trace possible differences to a certain period or posting group by analyzing the reconciliation of only one G/L account per period or posting group with the windows Analysis by Period and Analysis by Posting Group. Select Analysis by Period and Analysis by Posting Group to open the specific windows.
In the window Analysis G/L Entries you can analyse the general ledger entries based on a reconciliation view for example by dimension, source code or user ID.
Follow the steps below to analyse the G/L Entries
1. Open the window Analysis G/L Entries.
2. Select in the field Reconciliation View Code the reconciliation view code that you want to use for the analysis.
3. Select in the field Show as Lines the values that you want to show as lines in the analysis window. For example if you select G/L Account, all G/L Accounts will be shown in the Lines.
4. Select in the field Show as Columns which values can be shown as columns in the analysis window. For example if you select the Source Code, the amount of the source code by G/L Account will be shown in the columns.
5. On the FastTab Filters and by the time interval buttons you can refine the result. On the FastTab Options you can indicate if the result should include Closing Entries and should include a Rounding Factor, for example thousands. Check Show Column Name to show the name (or otherwise the code) of the columns.
Note! This window requires no calculation after changing filters and other parameters.
In the window Skipped Value Entries you can analyse and, if possible, process skipped value entries.
Value entries will be skipped if they belong to an unknown entry (not default Dynamics NAV), for example caused by customization. The module Inventory Reconciliation cannot determine how these entries should be processed automatically in the reconciliation. These entries are logged as a Skipped Entry Post during building or updating. You can define how the module Inventory Reconciliation should process these entries.
Follow the steps below to process Skipped Value Entries:
1. Open the window Reconciliation View.
2. Select the field No. of Skipped Value Entries and open the window Skipped Value Entry List by using the arrow down button. This window shows a list of all value entries that are skipped during updating the reconciliation view.
3. Select the value entry that you want to process. Select Function key Ctrl+Shift+V. The window Skipped Value Entry shows the skipped value entry and the reason for skipping on the FastTab General.
4. On the FastTab Accounts you can define G/L accounts for processing the skipped value entries. Depending on the entry type of the value entry, only the relevant fields are editable. Select in the fields Inventory Accounts, Inv. Account (Interim), WIP Account, Applied Account or Applied Account (Interim) the applicable G/L accounts. You can only select the G/L accounts that are linked to the value entry. The right side of the window shows a proposal (based on the current posting group setup) for the accounts that should be filled. These are not necessarily the correct accounts, because the setup may have been modified.
5. If you have filled in all accounts, the fields Inventory Amount, G/L Amount, Applied Amount and G/L Amount determine whether the value entry can be processed. The value entry can only be processed if the amounts on the inventory and applied accounts are in balance with the G/L accounts. If so, the field Reprocess is checked automatically and the entry will be processed when the reconciliation view is updated. Skipped value entries can also be processed manually by selecting Related Information, Skipped Value Entry, Reprocess Skipped Value Entries.
6. If you have multiple skipped value entries of the same posting type and posting groups you do not have to define the settings for every single value entry. Select Related Information, Skipped Value Entries, Apply Settings.
In this window you can define a value entry range to which the settings apply. The settings are only applied to the value entries with the same posting types and posting groups. Enter a value entry range and click OK.
7. After setting up all skipped value entries you can process them by updating the reconciliation view or select Related Information, Skipped Value Entries, Process Skipped Value Entries. If there are value entries that cannot be processed, please contact your Dynamics NAV partner. These entries are always caused by customizations or bugs.
A reconciliation view is used to set up customer-specific parameters for building and analysing the inventory valuation and the reconciliation of sub ledgers with the general ledger. In the window Reconciliation View you can define the template with which the reconciliation between the sub administration and the general ledger can be analysed. The reconciliation view is comparable with the standard Dynamics NAV Analysis View functionality.
Follow the steps below to create a Reconciliation View:
Open the window Reconciliation View by selecting Departments - Compliance - Control - Reconciliation - Inventory Reconciliation - Reconciliation View.
Press Ctrl+N to create a new reconciliation view.
Fill the fields on the FastTabs:
FastTab General
Code: Enter a suitable short name for the reconciliation view. Preferably base the name on the setup described further on in this instruction. A possible code to use is RECONCILIATION (Dutch: AANSLUITING).
Type: Select the type Inventory Reconciliation.
Name: Enter a clear description for your reconciliation view.
Date Compression: Select the smallest period to which the inventory data should be reconciled. You can choose Day, Week, Month, Quarter, Year or Period. The entries will be compressed at the selected level. Usually the inventory will be reconciled by month or accounting period.
Starting Date and Ending Date: These fields are only applicable for type Inventory Analysis.
Determine Used Accounts: Check this field if you want to perform the reconciliation based on historical postings (how is really posted). Do not check this field if you only want to analyse the reconciliation based on the current setup.
Blocked: Check this field if a reconciliation view should not be updated anymore.
FastTab Update
Show Progress: Check this box if you want to see the progress during the update. Attention: this can decrease performance.
Automatic Calc Analysis: If this box is checked, Dynamics NAV automatically calculates the data with the last set filters after updating the reconciliation view.
Process with Job Scheduler: With this option it is possible to automatically update the reconciliation view with the job scheduler with Codeunit 11111990.
FastTab Dimensions
Define (optional) up to 4 dimensions that you want to use for the set reconciliation posts.
FastTab Attributes
These attributes are used for reporting purposes. Define on this FastTab customer, vendor and item attributes. The attributes set on this FastTab are only available in the Value Reconciliation Entries.
To analyse the reconciliation between the sub administration and the general ledger, the reconciliation view has to be built completely the first time you want to analyse. After this you can update the reconciliation view with the latest entries to be able to analyse the most recent data.
Note!
Depending on the size of your database, the initial update of a reconciliation view can take a long time.
There is a timing difference in updating: the G/L Entry is updated first and the Value Entry second. Posting documents while the reconciliation is being built yields values in the sub administration which are not present in the G/L. This cannot be solved in environments with continuous posting. Best practice is to analyse only already closed periods (..P-1).
1. Open the window Reconciliation View.
2. Select Related Information, Update and select All. The reconciliation view is built. If you checked the option Show Progress you can see the progress during the calculation. When the update is finished, you can analyse the reconciliation in the windows Analysis Inventory Accounts and Analysis Applied Accounts.
3. The reconciliation view contains a number of (non-editable) information fields:
Last Date Updated VRE and Last Date Updated GRE: the date of the latest update of the reconciliation view.
Last Value Entry No. and Last G/L Entry No.: the numbers of the last updated entries respectively.
No. of Skipped Value Entries and No. of Skipped G/L Entries: show how many entries were skipped respectively. These entries could not be processed by the module Inventory Reconciliation, because they were not created according to the standard logic of Dynamics NAV.
Follow the steps below to update the reconciliation view:
1. Open the window Reconciliation View.
2. Select Related Information, Update and select From Last Entry. The reconciliation view will be updated from the last updated entry. If you checked the option ‘Show Progress’ you can view the progress during the calculation. When the updating is finished, you can analyse the reconciliation in the windows Analysis Inventory Accounts and Analysis Applied Accounts.
3. The function Update has the following options:
From Last Entry: Updates the reconciliation view from the last updated entry. With the first update, the reconciliation view will be updated from the first entry.
All: Updates all entries in the reconciliation view. Attention! If reconciliation views are already updated once, the existing entries will be deleted and all entries will be updated again.
Attributes: Updates all entries with the currently set up properties on the FastTab Attributes.
Setup Accounts: Updates all entries with the current posting setup (only applicable if posting setup has changed).
Analyses: Updates the analyses for the windows Analysis Inventory Accounts and Analysis Applied Accounts, based on the added filters.
Note! Depending on the size of your database, the initial update of a reconciliation view can take a long time.
4. The reconciliation view contains a number of non-editable fields:
Last Date Updated VRE and Last Date Updated GRE: are filled with the date of the latest update of the reconciliation view.
Last Value Entry No. and Last G/L Entry No.: are filled with the numbers of the last updated entries respectively.
No. of Skipped Value Entries and No. of Skipped G/L Entries: show how many entries were skipped respectively. These entries could not be processed by the module Inventory Reconciliation, because they were not created according to the standard logic of Dynamics NAV.
The last version of 2-Controlware for Dynamics NAV 2009 R2 is 7.00. Contact us if you need to download objects for this version or older.
For the customization of Dynamics NAV 2013 up to and including 2015, see Merge 2-Controlware code in NAV 2013 up to and including 2015. For NAV 2016 and up no merge is required because eventing is used.
For the 2-Control Compliance modules Field and Dataset Security and Mandatory Fields a separate activation is needed.
Your Microsoft NAV Partner needs to customize some lines in Codeunit 1 Application Management and Codeunit 423 Change Log Management. After these customizations the 2-Controlware modules are ready for setup and use. For the other modules no merge is required.
Important! If you have changed the default setting for the change log in Compliance Setup, you have to call the change log functionality in codeunit 1. See General Compliance Setup.
Important! For the customization of Dynamics NAV 3.xx – 2009 see Merge 2-Controlware code in NAV 3.xx-2009. For NAV 2016 and up no merge is required because eventing is used.
To activate the modules:
Field and Dataset Security
Mandatory Fields
Your Microsoft NAV Partner needs to customize some lines in Codeunit 1 Application Management, Codeunit 40 LogIn Management and Codeunit 423 Change Log Management. After these customizations the 2-Controlware modules are ready for setup and use. For the other modules no merge is required.
Important! If you have changed the default setting for the change log in Compliance Setup, you have to call the change log functionality in codeunit 1. See General Compliance Setup.
To enable multiple add-ons to use Codeunit 1 on NAV 2015 and older, a specific code merge is necessary. This needs to be reviewed per customer or installation as circumstances differ. The code merge as described in our manual is only an example useful if no other add-ons are used. As we use eventing in NAV 2016 and newer, no code merge is necessary. The code merged to Codeunit 1 for 2-Controlware fills a temporary memory table at login, holding the field security configuration for the user. The function GetDatabaseTableTriggerSetup checks whether field security is active by using database triggers. It sets four booleans to determine which triggers are executed:
OnDatabaseInsert;
OnDatabaseModify;
OnDatabaseRename;
OnDatabaseDelete.
2-Controlware can be used in conjunction with other add-ons in Codeunit 1. To do this, the code required for each add-on needs to be merged in such a way that they do not interfere. Our add-on checks the four booleans in one run; however, other add-ons might not do so. These are the steps required to combine the 2-Controlware add-on with other add-ons in Codeunit 1 (a sketch of the combined function follows after the recapitulation below):
1. By default the booleans are FALSE;
2. Create four local booleans in the function to store the most recent value, one for every trigger;
3. Execute the code of the first add-on;
4. Copy the values of the default booleans in the local booleans;
5. Check whether at least one of the four is still FALSE:
If none are FALSE, EXIT: everything is TRUE and there is nothing left to check;
If at least one is still FALSE, execute the next add-on's function and compare the default and the local booleans:
If local = FALSE and default = TRUE, then local becomes TRUE;
If local = TRUE, then nothing;
6. Do this for all add-ons.
These steps sum it up for all add-ons. The basic assumption is that an add-on only converts FALSE to TRUE, never the other way around. Converting from TRUE to FALSE would not be logical, because FALSE is the default value. Apart from that, once a boolean is set to TRUE it should remain TRUE for that specific add-on. To recapitulate: changing TRUE to FALSE is not possible. Another risk is that partners might do more than setting booleans in Codeunit 1. In that case you have to execute all code and copy the local booleans to the default booleans instead of using EXIT. Lastly, 2-Controlware checks the change log in our own codeunit by default. You probably do not want this when using multiple add-ons. With recent versions of 2-Controlware you can configure this in the table 2C Compliance Setup. It holds an option to configure where the change log is checked: Codeunit 2CW or Codeunit 1. Choose the latter option to change this.
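A minimal sketch of the combined function, assuming exactly two add-ons, the standard Codeunit 1 signature, and placeholder codeunit variables AddOnA and AddOnB for their respective trigger-setup functions (the actual 2-Controlware object and function names are not shown here), could look as follows. In practice the add-on code is usually pasted inline rather than called through codeunit variables; the calls below just keep the sketch compact.

```
// Codeunit 1, GetDatabaseTableTriggerSetup - sketch of combining two add-ons.
// AddOnA and AddOnB are placeholder codeunit variables, not actual object names.
PROCEDURE GetDatabaseTableTriggerSetup(TableID : Integer;VAR OnDatabaseInsert : Boolean;VAR OnDatabaseModify : Boolean;VAR OnDatabaseDelete : Boolean;VAR OnDatabaseRename : Boolean);
VAR
  LocalInsert : Boolean;
  LocalModify : Boolean;
  LocalDelete : Boolean;
  LocalRename : Boolean;
BEGIN
  // Step 1: the four VAR parameters start as FALSE.
  // Step 3: execute the code of the first add-on.
  AddOnA.GetDatabaseTableTriggerSetup(TableID,OnDatabaseInsert,OnDatabaseModify,OnDatabaseDelete,OnDatabaseRename);

  // Step 4: copy the values returned so far into the local booleans.
  LocalInsert := OnDatabaseInsert;
  LocalModify := OnDatabaseModify;
  LocalDelete := OnDatabaseDelete;
  LocalRename := OnDatabaseRename;

  // Step 5: if all four are already TRUE there is nothing left to check.
  IF LocalInsert AND LocalModify AND LocalDelete AND LocalRename THEN
    EXIT;

  // Otherwise execute the next add-on and merge the results;
  // a boolean may only go from FALSE to TRUE, never back.
  AddOnB.GetDatabaseTableTriggerSetup(TableID,OnDatabaseInsert,OnDatabaseModify,OnDatabaseDelete,OnDatabaseRename);
  OnDatabaseInsert := OnDatabaseInsert OR LocalInsert;
  OnDatabaseModify := OnDatabaseModify OR LocalModify;
  OnDatabaseDelete := OnDatabaseDelete OR LocalDelete;
  OnDatabaseRename := OnDatabaseRename OR LocalRename;

  // Step 6: repeat steps 4 and 5 for every additional add-on.
END;
```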
If you are merging code of 2-Controlware into NAV 2013 up to and including NAV 2015 without a license for the 2-Controlware modules present (e.g. a developer license of another add-on supplier), a license check is required. For NAV 2016 and up eventing is used. Because only for the modules
Field and Dataset Security
Mandatory Fields
a merge is needed, this code will initially only be of interest for Codeunits 1, 40 and 423 (see Merge 2-Controlware code in NAV 3.xx-2009 and Merge 2-Controlware code in NAV 2013 up to and including 2015). If visibility code is added on Forms or Pages, the license check is also relevant here (see Customization for Field and Dataset Security (Forms) and Customization for Field and Dataset Security (Pages)).
Does the user have direct permissions on the per source tables? See Required Authorizations.
Field and Dataset Security
Table ID | Table Name | Permissions |
11112031 | 2C Secured Table Per Source | Read, Insert, Modify, Delete |
11112032 | 2C Secured Field Per Source | Read, Insert, Modify, Delete |
11112033 | 2C Secured Dataset Per Source | Read, Insert, Modify, Delete |
Do you yourself have a permission set that is data owner?
Does the simplest version of field security work (no filter, no incidental filling, only shielding the field)?
Has the NAV / BC client been restarted? Only then does the adapted design become active.
Are the correct field(s) linked on the line(s)?
Is the correct set linked to the field on the line?
Are the correct triggers active when linking the set? (Triggers on the card are not active themselves; they are copied to the linked sets.)
Is the start date filled with today or earlier, and is the end date empty or in the future?
Is the setting Standard mutable correct?
Is a filter active? Is this correct? Note: with a filter you limit the lines of the table on which the functionality is active to the lines that are IN the result set.
Are you testing on the right line?
Is the checkmark Mutable set correctly?
You do not have to change it yourself; have it controlled by the field ‘std can be changed’!
Is the filter correct? Optionally test it on a page, or by running the table from the development environment or object explorer.
Does the user have direct permissions on the per source tables?
Mandatory Fields
Table ID | Table Name | Permissions |
11112045 | 2C Mandatory Table Per Source | Read, Insert, Modify, Delete |
11112046 | 2C Mandatory Field Per Source | Read, Insert, Modify, Delete |
11112051 | 2C Error Message per User | Read, Insert, Modify, Delete |
Does the most basic version of the mandatory field work (no filter, nothing conditional, just the ‘not empty’ check mark)?
Is a filter active? Is this correct? Note: with a filter you limit the lines of the table on which the functionality is active to the lines that are IN the result set.
With the Field & Dataset security module you can refine authorizations to field and dataset level. For example, you can easily split the permissions for different documents. It is an addition to Authorization Box and enables you the apply ‘waterproof’ segregation of duties in Dynamics NAV.
The Mandatory fields module makes it possible to have fields filled in compulsorily or to let them be filled in using predefined values. This prevents incomplete data and gaps in your internal data processing.
The latest version can be requested from your Microsoft Dynamics Partner or downloaded from our web portal.
Installations of our extensions are updated automatically as we release a new version. For on-premise installations of our extensions we provide updates on our support portal: 2-control.nl/en/webportal/.
Objecttype | Object-id | Objectname | Servicename |
---|---|---|---|
Codeunit | 11112022 | ABWebService | AB |
Page | 357 | Companies | ABCompanies |
Page | 358 | Objects | ABObjects |
Page | 595 | Change Log Entries | ABChangeLog |
Page | 9171 | Profiles | ABProfiles |
Page | 9800 | Users | AB Users |
Page | 11112200 | AB Permission Sets | ABPermissionSets |
Page | 11112201 | AB Permissions | ABPermissions |
Page | 11112202 | AB AccessControl | ABAccessControl |
Page | 11112203 | AB TenantPermissions | ABTenantPermissions (NAV 2016 and higher) |
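These services can be published from the Web Services page in the client. As a minimal sketch, assuming a single-tenant installation where records may be inserted directly into the standard Web Service table (verify the field names against your NAV/BC version), one line of the table above could be published like this:

```
// Sketch only: publish page 357 Companies as web service "ABCompanies".
// WebService is a Record variable on the standard Web Service table.
// Repeat for every line in the table above.
WebService.INIT;
WebService.VALIDATE("Object Type",WebService."Object Type"::Page);
WebService.VALIDATE("Object ID",357);
WebService.VALIDATE("Service Name",'ABCompanies');
WebService.VALIDATE(Published,TRUE);
WebService.INSERT(TRUE);
```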
This is possible with Field and Dataset security. This module allows you to easily split up the permissions for different documents.
This means that a user needs two permissions to, for example, write to a table: first the indirect permission to write to the table, and then the right to execute an object that itself has the direct permission to write to the table (illustrated in the sketch below).
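As an illustration, using purely hypothetical example objects in the 50000 range (these are not actual 2-Controlware or standard NAV objects): the user's permission set only contains an indirect write permission on the table plus the right to execute the posting object, and the posting object carries the direct table permission itself, so the table can only be written to through that object.

```
// Hypothetical example - object numbers, names and the permission set lines are invented.
//
// Permission set lines for the user:
//   Object Type = Table Data, Object ID = 50000 (My Ledger Entry),    Insert Permission = Indirect
//   Object Type = Codeunit,   Object ID = 50001 (My Posting Routine), Execute Permission = Yes
//
// The codeunit carries the direct table permission in its Permissions property,
// so the user can only insert records in table 50000 by executing this codeunit:
OBJECT Codeunit 50001 My Posting Routine
{
  PROPERTIES
  {
    Permissions=TableData 50000=imd;
  }
}
```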
With this release letter we inform you of the release of the new 2-Controlware version 10.03. This release contains new functionality as well as bug fixes. 2-Controlware 10.03 is available for versions 2016, 2017, 2018 of Dynamics NAV and Fall ’18 (R13) and Spring ’19 (R14) of Business Central On Premise. Versions for 2013 R2 and 2015 are available on request. Dynamics NAV version 3.xx, 4.xx, version 5.xx, NAV 2009, NAV 2009 R2 and NAV 2013 are no longer supported. It is not necessary to update your NAV license to be able to use the updated and new objects.
2-Controlware 10.03 has incremental improvements on the existing functionality and several bugs are fixed. See below a description of the most important changes.
Tenant permissions: In Business Central tenant permissions are used for User-Defined permission Sets. Navigation on this type of permissions did not work correctly on all pages. This is optimized. Also the following reports have been modified in BC to handle the tenant permissions correctly:
Permissions per Permission Set
Permissions per User
Permissions per Organization Role
When using Dataset Type ‘Visible’ it is no longer possible to select a Table Relation because you are not able to do the setup based on the related table.
Renaming G/L account No.: When renaming a G/L account with more than 10 characters, the system gave an error. This issue is solved.
Analysis Permissions: When analyzing all Standard Competences without a filter, the system also gave results for read permissions on object 0. This was not correct if it had not been included in the Standard Competence. The issue is solved.
Combination Filter: When using wildcards and a combined filter in a Mandatory field setup the system did not check on this. The issue is solved.
Validation Type error: a Mandatory Field with Validation Type ‘Error’ only gave a warning. It was possible to close the page without changing the field. This is fixed.
Minimum and Maximum Length: A Mandatory Field with Not Blank and only a Maximum Length did not work. This is solved. With only a Maximum Length you have to fill in at least something, up to the maximum length.
Synchronize Organization Role: When synchronizing one organization role, permissions in this organization role were removed for all users. This went wrong with the Synchronization Type setting ‘Replace Permission Sets’. This issue has been resolved.
For a new installation see the installation instructions. To perform an update installation from a previous 2-Controlware (= 2CW) version, see the table below. The fobs of version 4.01.01 (VM) are available upon request.
Import (with Replace All) and compile version 2-Controlware 10.03 (“NAVW1x.xx,2CW10.03.fob”).
For NAV 2016 and up, the customization of Codeunits 1, 40 and 423 needs to be removed. If you have no other add-ons, you can – after consulting your NAV-partner – replace these Codeunits with the default NAV objects.
Important!
Please be sure to Replace All existing objects. Default action for some objects is Skip or Merge. Furthermore, always check if Menusuite 1052 is available.
There is no impact on permissions.
To be able to use the new objects you do not have to update your license.
The 10.03 update-package consists of one fob-file:
NAVW1x.xx,2CW10.03.fob: the objects can be identified by version number 2CW10.03.
Dynamics NAV version
2-Controlware 10.03 is available for versions 2016, 2017, 2018 of Dynamics NAV and Fall ’18 (R13) and Spring ’19 (R14) of Business Central On Premise.
No new setup is involved.
With this release letter we inform you of the release of the new 2-Controlware version 10.02. This release contains new functionality as well as bug fixes. 2-Controlware 10.02 is available for versions 2016, 2017, 2018 of Dynamics NAV and Fall ’18 (R13) and Spring ’19 (R14) of Business Central On Premise. Versions for 2013 R2 and 2015 are available on request. Dynamics NAV version 3.xx, 4.xx, version 5.xx, NAV 2009, NAV 2009 R2 and NAV 2013 are no longer supported. It is not necessary to update your NAV license to be able to use the updated and new objects.
2-Controlware 10.02 has new features and incremental improvements on the existing functionality. Furthermore, several bugs are fixed. See below a description of the most important changes.
User Groups in Authorization Monitoring: If user groups are used within Dynamics, these can now be included in the analysis of permissions and conflicts. To activate this, the new option “Analysis with User Groups” must be activated within the Authorization Monitoring Setup. This will include the user groups in the analysis. The source type User Group can now be found within the analysis results. Also, a filter is included for this source type. You can also link a user group to a standard competence, similar to linking an organizational role. After a user group has been linked, the analysis result of the user group will automatically be set to Finding Accepted, and can be automatically approved.
Standard Competences Filter: In the Analysis Permissions page, it is now possible to see per line which standard competences apply to that line, so you can filter easily. The standard competence filter is included as a column, where the applicable standard competences for these results are shown.
Lookup user from analysis permissions page: When using a specific license, the link to the user in the analysis permission results did not work properly. This issue has been solved.
Incorrect property NAV Enabled: When using a specific license, the property ‘NAV Enabled’ in the analysis monitoring was not presented correctly. This issue has been solved.
Source name: When using a specific license, the name of the source in the analysis results was not shown. This has been fixed, so the name of the user in question is shown.