Finally we’re allowed back on our bikes, and it was an amazing ride. A circular up North via Nuneaton.
So, as we all know, reports can be designed using Visual Studio or Report Builder and uploaded to the Reports section in CRM. But what if you want to display a report directly in a form? This is what a client wanted to do, as it would let their users view non-CRM data on a CRM form. In the past they had run into browser security issues and had not gotten very far.
To test this I used an Azure VM with SQL Server 2014, including SSRS (SQL Server Reporting Services), installed on it. A SQL table and a sample report were created, which looked as below:
Next, a new CRM form was created and an IFRAME added to it with the URL (http://crm2015dev/Reports/Pages/Report.aspx?ItemPath=%2fTestSSRSReport%2fTestSSRSReport) of the above report. The number of rows for the IFRAME was set to 12, up from the default of 6.
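The IFRAME URL is just the SSRS report viewer URL with a URL-encoded ItemPath parameter pointing at the report. As a quick illustration, a small helper can build it (the buildSsrsReportUrl function is my own example, not a CRM or SSRS API; the server name and report path are the ones used in this post):

```javascript
// Build the SSRS report viewer URL used in the IFRAME.
// The ItemPath query parameter must be URL-encoded, so "/" becomes "%2F".
function buildSsrsReportUrl(server, reportPath) {
  const itemPath = encodeURIComponent(reportPath);
  return "http://" + server + "/Reports/Pages/Report.aspx?ItemPath=" + itemPath;
}

// Example using the report from this post:
const url = buildSsrsReportUrl("crm2015dev", "/TestSSRSReport/TestSSRSReport");
```

The same encoding applies however you obtain the URL: if you copy it from the browser address bar the slashes in the report path will already appear as %2f.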
On opening the CRM form, two security prompts were displayed and had to be accepted before the report would show. These are the two prompts:
A bit of searching online and in forums revealed the two Internet Explorer settings that needed to be changed to remove the prompts. Both settings are in the same dialog: open IE's Internet Options dialog, then the Security tab.
Select ‘Local intranet’, then click the button ‘Custom level…’
Locate the option ‘Display mixed content’ and change it to ‘Enable’
Locate the option ‘Websites in less privileged web content zone can navigate into this zone’ and change it to ‘Enable’
After restarting IE the CRM form now displays the SSRS report with no security prompts:
Once you know the settings to change in IE it’s pretty straightforward to remove the security prompts. I should say that I’m no expert in IE and I’m not sure of the security implications of making these changes, so if you’re in a corporate environment check these settings with your desktop team.
NOTE: These are the version numbers of the applications:
CRM: Microsoft Dynamics 365 Version 1612 (184.108.40.206) (DB 220.127.116.11) online
SSRS: SQL Server 2014 SP1 (Version 12.0.4100.1)
Internet Explorer (IE): Version 11.0.9600.18817
Photo by William Iven
Recently I was asked by a client to look into the possibility of using Power BI Dashboards/Reports and displaying them inside their CRM Online instance. One of the priorities for the investigation was to ensure that the dashboards/reports could be filtered for the current user. For my test I want to be able to build a dashboard that shows the CRM activities for the logged in user.
Power BI licensing looks very reasonable: no cost to author reports and publish online, and $9.99 (approx £7.50) per month for the Pro version, which gives the ability to share reports and schedule data refreshes. Also, a quite lengthy 60-day trial is available, which is plenty of time to see if it’s right for you. After a bit of reading it seems that Power BI can pull in data from a number of data sources, store it internally, and then build very nice-looking charts from the data. The data can be set to refresh at regular intervals, though the minimum refresh interval is one hour, and this scheduled data refresh requires a Pro license.
To enable me to test the reports/dashboards I’ll be using the CRM Activity entity. There are a few dozen records in my test CRM instance, and I am the owner of one of the activity records.
Using my MSDN account I created an Azure Virtual Machine running Windows 2012R2 and installed the Power BI Desktop. It’s free to install and use (the installer is 143MB).
Clicking on the Get Data button brings up a dialog with a list of many available data source types. Entering 365 into the search box filters the list allowing the selection of ‘Dynamics 365 (online)‘.
The next dialog prompts for the Web API URL for the CRM instance:
The Web API URL for your instance of CRM can be found by navigating to Settings\Customizations, clicking Developer Resources, and copying the value out of Service Root URL. Select the value with your mouse and press Ctrl+C to copy it.
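For reference, the Service Root URL has this general shape (the organisation name and the version segment are examples and will differ for your instance):

```
https://yourorganisation.api.crm.dynamics.com/api/data/v8.2/
```

Whatever value your Developer Resources page shows is the one to paste into the Power BI dialog.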
Next is a prompt for my CRM login credentials. Having entered the credentials a list of entities that are in the CRM instance is displayed. I selected systemusers and activitypointers – filtering the list using the search box makes it easier to find an entity:
The data can be filtered to show only the current user’s data by using Row Level Security (RLS). RLS lets you configure filter criteria on the data per role. I will simply add a role that filters the systemusers table to the logged-in user; as the activitypointers table is joined to it, it will be filtered as well.
The first step is to make a one-to-many join between the two tables. This is achieved by clicking on the relationships tab to go to the mapping screen.
From here the two tables are visible, and can be resized to view more of the columns. To create the link, drag the systemusers systemuserid column to the activitypointers _ownerid_value column:
There should now be a line linking the two tables with a 1 on the systemusers side and a star on the activitypointers side. Double-clicking the join allows the properties to be viewed – they should be:
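The screenshot isn’t reproduced here, but the values you’d expect in the Edit relationship dialog are roughly as follows (defaults assumed, so treat this as a sketch rather than the exact dialog):

```
From: activitypointers (_ownerid_value)   To: systemusers (systemuserid)
Cardinality: Many to one (*:1)
Cross filter direction: Single
Make this relationship active: ticked
```

With the filter flowing from the one side (systemusers) to the many side (activitypointers), filtering the user table automatically filters their activities.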
The next step is to add the filter. Click back to the Home tab and then the Manage Roles button:
This is the filter that should be added:
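The screenshot of the role isn’t reproduced here, but a typical DAX table filter on the systemusers table looks like this (assuming internalemailaddress is the column holding each user’s login email, as it is in CRM’s Web API):

```
[internalemailaddress] = USERPRINCIPALNAME()
```

USERPRINCIPALNAME() returns the email address of the user viewing the report, so each user matches only their own systemusers row.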
We can now add the data to the main view area. For our test a simple grid of data will be sufficient. On the list of Visualizations click the Table icon:
This inserts a table onto the main view area, and columns from the Fields list on the far right can now be dragged onto it. Formatting the table is done by selecting it, then clicking the ‘Paint’ icon and changing the values.
This is how mine looks:
It’s very important to test what we’ve done, as we don’t want users seeing other users’ data. Click on the ‘View as Roles’ button:
Then enter the following selections, using your email:
The data in the table should now be filtered to display just your data:
The next step is to Save and Publish the Dashboard to Power BI Online. This is done by clicking File\Save (mine is called CRMUserActivities) then File\Publish and then Publish to Power BI.
Next, go to the Power BI website. This step is also critical, as we’re going to configure which users the filtering applies to. Expand ‘My Workspace‘ and under Datasets you will find your dataset. Click on it, then click the ellipsis, and then the Security menu option:
Now, all the users should be added to the Row-Level Security. This can be a bit tedious but I don’t think there is any way around this:
To convert the report to a dashboard that is viewable in CRM, the first action is to open the report and click ‘Pin Live Page’.
A dialog is displayed and ‘New dashboard‘ can be selected and the name of the dashboard entered:
By default Power BI displays report page numbers, which can make the dashboard look a bit untidy. Removing the page numbers is not very intuitive, and it took me a while to figure out how to do it. Firstly, click on the new dashboard that has been created under My Workspace:
The next step is a bit tricky. Move your mouse over the dashboard and an ellipsis should appear to the top right. Click on this ellipsis:
This brings up another menu, click on the ‘Pencil’ icon:
Finally, we are on the correct window. Untick ‘Display title and subtitle‘ and then click Apply.
Note: there is also an option on this dialog to ‘Display last refresh time’. This is the date/time the dashboard was last refreshed by Power BI – something you may want to show to your users.
We’re nearly there. To enable Power BI dashboards in CRM there’s a setting under Settings\Administration, then System Settings. On the Reporting tab there’s an option to enable Power BI dashboards – set this to Yes:
In CRM, go to the Dashboards screen, and click on the arrow to the right of the New button, and select Power BI Dashboard:
And finally you can select your Power BI dashboard.
I also added a pie chart and a total count to my dashboard.
An added bonus is that Power BI dashboards are interactive: you can select records in the table or items in the charts, and the other tables/charts will update accordingly. For example:
Select records in the table
Selected items on the chart
On creating my first dashboard using this method it didn’t appear to work, and after some time I realised that the creator of the dashboard sees all of the data. This can be a bit confusing – I’ve read it’s because the creator of the dashboard is the owner of the report and the dataset.
There is a method of testing the Dashboard if you’re the creator/owner. Go back to the Power BI application website, open the dataset Security settings:
Then select the RLS item, click on the ellipsis, and click on ‘Test as role’
You can now test the Dashboard using your data:
Power BI dashboards can be made to use live data via streaming, but the method described above uses data that is updated on a schedule. This means they shouldn’t replace normal CRM dashboards where the data needs to be live, e.g. outstanding customers to call.
I hope this gave you a good introduction to Power BI and CRM, and what they are capable of in terms of creating meaningful interactive dashboards.
In the days when 100MB drives were the norm and Microsoft Office came on 43 floppy disks, in the process of formatting a floppy disk I somehow managed to format my work hard drive. Computers had a single drive then, so I lost all of my data and the operating system too.
Since then I have kept copies of all my important data on USB drives which I keep at work and bring home monthly to update; the process is described here. Ideally I’d have two sets of drives so that the data and backups are never in the same location – I’ll get to that someday. For now the backup consists of two drives which hold roughly 4TB in total.
4TB might sound like a lot, but I have a lot of photos, music and videos to back up. Using a Canon DSLR regularly, along with a GoPro-style helmet cam, means the storage needed for photos and videos grows rapidly.
This is a breakdown of my usage as of today:
After a bit of research online I came up with two options: Backblaze and CrashPlan – both reasonably priced with good reviews. Unfortunately, CrashPlan was putting an end to its Home service; I’m assuming it wasn’t profitable enough for them.
This left me with Backblaze. It sounded very good: $5 a month for one computer, with unlimited backup size, unlimited file sizes, and backups of external drives.
I signed up for a 30-day trial and installed the PC client. The first thing that surprised me was that you can’t easily choose what to back up. By default it backs up everything, even your OS drive. The reason they give for this is that most users don’t want to decide what to back up! I had several very big folders on my drives which I didn’t want backed up, and managed to exclude them (though excluded folder names apply to all drives).
After a few days it had only backed up about 10GB, a long way from a complete backup set. I wondered whether it was because my PC was connected to the ISP router over a wireless connection, but speed tests from a laptop over both wireless and wired connections came back with the same result – my upload speed was 10Mbps.
Checking the Backblaze client indicated that it was using all the bandwidth available to it, and backing up around 2GB per hour. Now, my PC is only switched on when I’m using it. A few years ago it would be switched on 24/7 when I was doing 3D rendering but I don’t do that any more.
So I did a calculation of how long the backup would take to complete using my current bandwidth, if I left it on for 5 hours a day (which is more than I normally use it):
5hrs x 2GB = 10GB/day
2,836GB ÷ 10GB/day = 283 days = 40 weeks
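The same estimate as a quick script, using the figures from this post:

```javascript
// Back-of-the-envelope check of how long the initial upload would take.
const uploadPerHourGB = 2;   // observed Backblaze upload rate
const hoursPerDay = 5;       // how long the PC is switched on each day
const backupSizeGB = 2836;   // total data still to upload

const perDayGB = uploadPerHourGB * hoursPerDay; // 10 GB/day
const days = backupSizeGB / perDayGB;           // ~283.6 days
const weeks = days / 7;                         // ~40.5 weeks
```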
So my first backup wouldn’t complete for 40 weeks! And that’s not including the files added during that time. It’s at this point I decided it wasn’t the right time to move to online backups. The way internet speeds are increasing I’ll have a much quicker upload in the future, and it may be worth paying extra for more bandwidth.
And that’s why I don’t use online backups.
Photo by Thomas Kvistholt
Everyone should have a backup of their important data, in case their main hard drive fails and everything’s lost forever. Oh, and it’s not if, it’s when your main hard drive fails. If the worst does happen you may be lucky and be able to retrieve some files from your drive, but it’s hit and miss, and can be very expensive.
This Lifehacker post discusses average lifetimes for hard drives, which range from a few months to five years. I back up all important data to two external USB drives, and I’ll describe the process I use for each backup drive below.
First of all I put a file in the root directory of all the drives (main and backup) so that the backup procedure can test for the existence of drives before running. The file format I use is Backup-E.TXT where E is the drive letter.
Secondly I create a batch file which does the copying process for each backup drive.
(text of batch file here)
Lines 7-8 check for the existence of the main and backup drives; if either is not found the script jumps past the copy step and no backup occurs.
Line 23 is the command that actually backs up the data using the excellent Robocopy, which is built into modern versions of Windows (for older versions it was part of the Windows Resource Kit). It only copies new and updated files, so the whole process is much quicker than copying the full data set again.
If the backup runs successfully it will show the Start and Finish times so that you can see how long the backup took. I normally start it and leave it for an hour or two.
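The batch file itself isn’t reproduced above, but a minimal sketch of the approach it describes might look like this (drive letters, folder names and Robocopy switches are examples, and the line numbers won’t match those mentioned above):

```
@echo off
rem Abort if either the source or the backup drive is missing,
rem using the Backup-<letter>.TXT marker files in each drive root.
if not exist E:\Backup-E.TXT goto nodrive
if not exist F:\Backup-F.TXT goto nodrive

echo Backup started:  %DATE% %TIME%
rem /E copies subfolders; Robocopy skips unchanged files by default
robocopy E:\ F:\Backup-E /E /R:2 /W:5
echo Backup finished: %DATE% %TIME%
goto end

:nodrive
echo Drive not found - backup skipped.

:end
```

The two echo lines give the start and finish times mentioned above, and the marker-file checks stop the copy from running against the wrong drive letters.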
And there we have it: all data backed up. Don’t forget to take the drives to an offsite location. I keep mine at my work office, but it could be a family member’s or friend’s house.
Photo by Patrick Lindenberg
This is the first post of Harjit Samra’s new blog. The old blog can be found at harjitsamra.blogspot.com – I thought I’d keep it for historical reasons.