Viewing Historical Rides GPS Data (Multiple KMLs)

BACKGROUND
I’ve been riding motorcycles for about 5 years, and on pretty much every ride I’ve used Geo-Tracker to record the GPS co-ordinates of the route and create a KML file (which I store in Google Drive). I was interested in finding new areas to explore, so I wanted to view all of the areas I’ve ridden and highlight those I’d missed.

Below is the map I had at the end of this process, which was created in Google Earth Pro (the Windows version – it’s free). Each line is coloured by the year the route was made.

As you can see, I live in the Midlands, near Coventry. Below is a close-up of that area, with yellow circles highlighting areas that I’ve not ridden:

THE PROCESS
I found it easier to group all the KMLs I have by year and then put them into separate folders. You don’t need to do this and can work with a single folder of KMLs if you wish.

Convert multiple KMLs to KMZ
This simply packages multiple KMLs into a single KMZ file, which makes the routes easier to handle in Google Earth.
Install and start Google Earth Pro, and add a new folder (either call it the year or, if not grouping by year, something like ‘ROUTES’):


Then, select all of your KML files for that year and drag them directly onto the new folder. You should now have something like:

Right-click on the year, select ‘Save Place As’, then enter the file name and save. Then right-click on the year again, click ‘Delete’ and click OK.
Follow this process for all the years you want to process, or just the one route if you prefer.

Add all KMZ’s to Google Earth
Still in Google Earth Pro, there should now be no folders left, so create a new one. I call it ‘By Year’.
Then, drag all of the KMZ files (NOT the KMLs) onto this folder.
Then click on the arrow symbol next to each sub-folder to collapse it:

Double-clicking on the ‘By Year’ folder should show all your routes on the map; the next step is styling and colouring these.


Styling and Colouring
To change the width of all lines and remove the yellow pin icons, right-click on ‘By Year’, then ‘Properties’. Click on the ‘Style, Color’ tab, then the ‘Share Style’ button.
Change the highlighted values as below:

Click OK and OK.

For each year, right-click on it, then ‘Properties’. Click on the ‘Style, Color’ tab, then click on the Color to change it. Click OK and OK.

Final Steps
On the Layers section I turn them all off except for ‘Labels’ as it looks cleaner, but you can show ‘Places’ or ‘Roads’ if you prefer:

For easier viewing in Google Earth Pro I prefer the top-down view; the keyboard shortcut for this is ‘u’. Here are other Google Earth Pro keyboard shortcuts.

Closing Thoughts
I store and add new areas to explore in Google My Maps, and for route planning and navigation I use Google Maps on my phone with one headphone plugged in.

Photo by José Martín Ramírez Carrasco on Unsplash

Downloading Salesforce Or Dynamics CRM Data To SQL Server (For Free)

Summary
As my background is in SQL Server databases, I find it a lot easier to query data in SQL. This post will show you how to carry out a manually-run download of data from multiple objects in Salesforce or Dynamics CRM, using trial versions of the KingswaySoft tools in SSIS (SQL Server Integration Services).

I’ll be connecting to a trial Salesforce instance and downloading Contacts to SQL Server. Whenever I say Salesforce it can be interchanged with Dynamics CRM – the KingswaySoft CRM Components only slightly differ in how the objects and fields are selected.

I’ll assume that you have no prior knowledge of Visual Studio or SSIS, although you will need to know how to connect to and run queries against a SQL database.

Requirements
You’ll need:
* A Salesforce or Dynamics CRM instance (this can be a trial)
* A Windows PC/Server with SSIS installed.
* A SQL Server instance (this can be hosted on a VM, or directly in Azure) with an empty database.

I’ll be using a VM running Windows 10 with the below items installed:
* SQL Server 2012 SP4
* Visual Studio 2015 Update 3
* SQL Server Data Tools for Visual Studio 2015
These are the versions I had installed on my VM – newer versions of Visual Studio and SQL Server Data Tools should run fine, though I haven’t tested them.

Install KingswaySoft SSIS Integration Toolkit
We’ll install a trial version of this to allow us to connect to Salesforce. The only limitation of the trial version is that it can only run in the development environment. This means the solution will not work if we deploy the SSIS package to a server – that requires a licence.

Download links
SSIS Integration Toolkit for Salesforce –
https://www.kingswaysoft.com/products/ssis-integration-toolkit-for-salesforce/download
SSIS Integration Toolkit for Microsoft Dynamics 365 –
https://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-365/download
On the download form KingswaySoft requires a corporate/business email.

If you haven’t got such an email you can use a temporary email from https://10minutemail.com/

Installation
Once you have the installation files just install the toolkit taking the default options.


Create A Visual Studio Project
In Visual Studio create a new project of type ‘Integration Services Project’

If this option isn’t available, ensure that SSIS and SQL Server Data Tools are installed correctly.

Add and Configure Connection Managers
We will make two connections here – one to Salesforce, and the second to SQL Server.

Connection to Salesforce
Right-click on the “Connection Managers” area on the bottom of the screen and select “New Connection…”

In the next dialog select the “Connection Manager for Salesforce” then click “Add”

You will now need to choose the Instance Type, and enter your (Salesforce) User Name & Password, and Security Token.
The Security Token ensures that even if someone has your Salesforce account details, they can’t connect through the data connector without it.
If you know your Security Token then enter it, otherwise request a new Token. This is the Salesforce Help Page for Security Tokens – https://help.salesforce.com/articleView?id=sf.user_security_token.htm&type=5
Click on “Test Connection” and you should see a message stating that the Test Connection Succeeded. If not, check the details you have entered.

Connection to SQL Server
Right-click on the “Connection Managers” area on the bottom of the screen and select “New OLE DB Connection…”

On the next dialog click “New” to make a new data connection, then enter the details for your SQL Server connection.
My connection is shown below, connecting to the local SQL Server (the dot in the Server Name) using SQL Server authentication and a database called “SFDB-Blog”.
Click on “Test Connection” and you should see a message stating that the Test Connection succeeded. If not, check the details you have entered.
Click on “OK” to close the Connection Manager dialog.

You should now have 2 connections in the “Connection Managers”. Don’t worry about what they’re named – that’s not important.


CREATING THE SALESFORCE SOURCE COMPONENT
From the SSIS Toolbox drag a “Data Flow Task” to the package

Right-click on the Data Flow Task and rename it to “DFT – Download Contacts”
Double-click on “DFT – Download Contacts” to edit it. You will now be in the Data Flow edit screen:

From the SSIS Toolbox drag a “Salesforce Source” component to the Data Flow

Double-click the newly added “Salesforce Source”. Click on the Connection Manager dropdown and select the value Salesforce Connection Manager.
For the Source Object select the object you want to download. I will be downloading the Contact object.

Click on “Columns” on the left and select the fields you want to download. As a general rule of thumb it’s better to download only the fields you need – but always include the Id column (the record’s unique identifier). You may want to click on the top check box to uncheck all fields initially.

Click the ‘OK’ button to close the dialog

CREATING THE SQL DESTINATION COMPONENT
From the SSIS Toolbox drag an “OLE DB Destination” component to the Data Flow. Rename the newly added “OLE DB Destination” to “OLEDB – Contacts”.
Click on the “Salesforce Source” component, and then drag the connector arrow to the “OLEDB – Contacts” component. It should now look as below:

Double-click “OLEDB – Contacts” to open the edit dialog.
We will now use the component to generate the SQL to create the SQL Server table that will hold the data. Click the “New” button and the SQL will be displayed:

Copy this SQL and paste it into a connection to your SQL Server – I’m using SSMS (SQL Server Management Studio). In the script, change the text “OLEDB – Contacts” to the name you want to give the SQL table – I’ll be using “Contacts”. Execute the query to create the SQL table.
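For illustration, the generated script ends up looking something like the sketch below – this assumes only the Id, FirstName, LastName and Email fields were selected; your column list and lengths will match the fields you chose:

CREATE TABLE [Contacts] (
    [Id] nvarchar(18),
    [FirstName] nvarchar(40),
    [LastName] nvarchar(80),
    [Email] nvarchar(80)
)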

Back in Visual Studio, cancel the open script dialog, and in the main dialog select your new table from the table dropdown.

Click on “Mappings” on the left to ensure that all columns are mapped correctly – it should look similar to below, where all columns are connected from left to right:

Click “OK” to close the dialog.

FIRST RUN OF DOWNLOADING DATA
We are now ready to run and test downloading data from Salesforce into the first table.
Click the “Start” button

After a few seconds (or minutes depending on the number of records in your Salesforce data object) it should complete successfully:

To ensure it has worked correctly view the data in your SQL table:
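A quick query in SSMS along these lines (assuming the table is named Contacts, as above) will confirm the rows have arrived:

SELECT COUNT(*) AS ContactCount FROM [Contacts];
SELECT TOP 10 * FROM [Contacts];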

CLEAR DATA BEFORE RUN
When the routine runs we want to clear the existing SQL data first, otherwise it would append duplicate data to the SQL table.
To do this, click on the “Control Flow” tab, and drag an “Execute SQL Task” component to the main pane, above the “DFT – Download Contacts” component:

Right-click on the newly added “Execute SQL Task” and rename it to “SQL – Truncate Contacts”
Double-click on “SQL – Truncate Contacts” to edit it. For the “Connection” select the SQL connection, and in the “SQLStatement” enter the statement below (using your SQL table name):
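Assuming the table created earlier was named Contacts, the statement would be:

TRUNCATE TABLE [Contacts];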

Click OK to close the dialog, and then connect the components by dragging the arrow from “SQL – Truncate Contacts” to “DFT – Download Contacts”
Click on the “Start” button to run the routine and once complete it should look as below.


ADDING MORE DOWNLOAD TABLES
You can repeat the above process to add additional tables. I’ve added the Account table – as shown below:

WRAP UP
I hope you found this post beneficial – Thanks for reading 🙂

Photo by Markus Spiske on Unsplash

Power Apps – Using Variables in ForAll()

The ForAll function is very handy if you want to loop around a collection and perform some calculations.  One limitation is that you can’t set variables inside the ForAll loop using UpdateContext or Set.

There is a workaround, which is to create a single-record collection to hold the values, which can then be used as variables. The downside is that it makes the “code” less straightforward to read.

This is the process I use:

Add a line to the Form OnVisible to reset the collection (unless you’re using the variable like a Global in which case put it in the App OnStart).

ClearCollect(colVariables, {colvarVariableOne: ""});

To read the variable value use:

First(colVariables).colvarVariableOne

To set the variable value use:

Patch(colVariables, First(colVariables), {colvarVariableOne: "NEW VALUE"});
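Putting this together inside a ForAll – a hypothetical sketch that sums an Amount column from a collection called colOrders (both names are made up for this example), using the single-record collection as the running total:

ClearCollect(colVariables, {colvarTotal: 0});
ForAll(
    colOrders,
    Patch(
        colVariables,
        First(colVariables),
        {colvarTotal: First(colVariables).colvarTotal + Amount}
    )
);

After the loop, First(colVariables).colvarTotal holds the total.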

One thing that can catch you out is that collection column types are set when the collection is initialised, so set the default value to be of the required type. For example, to create a datetime column use:

ClearCollect(colVariables, {colvarHoliday: DateTimeValue("01/01/2000 00:00")});

Photo by Tine Ivanič on Unsplash

Power Apps – Photo Camera Selection

For any mobile app these days it’s normally a requirement to be able to take photos for reference, and the new App I’m working on is no different.

Adding a photo facility to an App is very straightforward – put a Camera control onto a screen and add code to its OnSelect to take the photo and add it to a collection using Patch. Here’s my code to do this:

Patch(colPhotos, Defaults(colPhotos), {Value:Camera1_1.Photo, PhotoDateTime:Now()});

Of course, most devices have multiple cameras these days – at minimum one front-facing and one rear-facing.   PowerApps allows you to change the camera in use by setting the Camera property of the Camera control (!) to a number.

These are the devices I have, and their associated Camera numbers

Samsung Galaxy S7 
0 = Rear Camera, 1 = Front Camera, 2 = Not Available

iPhone 7
0 = Rear Camera, 1 = Front Camera, 2 = Front Camera

Samsung Galaxy S9
0 = Front Camera, 1 = Front Camera (Zoomed in), 2 = Rear Camera

NOTE: All the devices I’ve tested have only two cameras, one front and one rear – though there are many newer devices with several rear cameras.  I’m guessing that they would have camera numbers from 2 onwards, though I can’t verify this.

So it appears that the rear camera, which the user is most likely to use, is not always 0.  To allow the user to change camera I use a variable and store it to the local cache so the App will start on the same camera the user last used.

Code-wise, the Camera control’s Camera property is set to varCameraNumber.  There is a ‘Switch Camera’ button which loops around cameras 0 to 2 using the code:

If(varCameraNumber = 2,
    UpdateContext({varCameraNumber: 0}),
    UpdateContext({varCameraNumber: varCameraNumber + 1})
);

This is the code in the Screen OnVisible section.  It uses LoadData with the 3rd parameter (IgnoreNonexistentFile) set to true. This is for when the App is first run and the file doesn’t exist yet.

LoadData(colCameraNumber, "colCameraNumber", true);
If(CountRows(colCameraNumber) = 0,
    UpdateContext({varCameraNumber: 0}),
    UpdateContext({varCameraNumber: First(colCameraNumber).CameraNumber})
);

The last bit of code is at the end of the OnSelect of the ‘Switch Camera’ button.

ClearCollect(colCameraNumber, {CameraNumber: varCameraNumber});
SaveData(colCameraNumber, "colCameraNumber");
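Assembled from the snippets above, the complete OnSelect of the ‘Switch Camera’ button ends up something like this sketch:

If(varCameraNumber = 2,
    UpdateContext({varCameraNumber: 0}),
    UpdateContext({varCameraNumber: varCameraNumber + 1})
);
ClearCollect(colCameraNumber, {CameraNumber: varCameraNumber});
SaveData(colCameraNumber, "colCameraNumber");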

That’s it, a fully functioning Camera control allowing switching of the viewing camera, which is saved per session too.  🙂

PowerApps – Check digit calculator

I’m quite new to the world of designing and developing Power Apps and came across an interesting puzzle during the creation of my first App. I can’t go into specifics about what the App is used for, suffice to say that there are parent records created on the server and the App is used to create multiple child records linked to the parent record. The App also has a requirement to be able to function when the device is offline (i.e. no mobile data available).

In the scenario where the user’s device is offline, the user will call the office and be told the ID of the parent record they are working with. The ID field is just an incremental counter on the backend and the user would enter this onto their device. To ensure that they are working with the correct parent record, and to catch user typing errors, I thought it would be a good idea to add a check digit to this number.

I initially posted onto a PowerApps forum to see if someone had designed code for this method previously. There were no replies so I went about writing the code myself.

Firstly, there was the matter of choosing an algorithm to generate the check digit. The Luhn Algorithm seemed to fit the requirements – it isn’t overly complex, and it is used on credit card numbers so it’s well proven. I used a variant of the algorithm using modulus 10.

These are the steps for the calculation (taken from this website).

Summary: Given an identifier, let’s say “139”, you travel right to left. Every other digit is doubled and the other digits are taken unchanged. All resulting digits are summed and the check digit is the amount necessary to take this sum up to a number divisible by ten.

Detail:
1. Work right-to-left, using “139” and doubling every other digit.
9 x 2 = 18
3 = 3
1 x 2 = 2
2. Now sum all of the digits (note ‘18’ is two digits, ‘1’ and ‘8’). So the answer is 1 + 8 + 3 + 2 = 14, and the check digit is the amount needed to reach a number divisible by ten. For a sum of 14 the check digit is 6, since 20 is the next number divisible by ten. (As a check, the full number “1396” then sums to 6 + 1 + 8 + 3 + 2 = 20, which is divisible by ten.)

Below is a recording of the App being run using the test value of 139

The code took a fair bit of time to develop as PowerApps is not really designed to work with loops and arrays, hence the need for several collections. Below is the full code:

// Split the input string into a collection of single characters
ClearCollect(colStringChars, Split(txtInputValue.Text, ""));

// Offset so that 'every other digit' is counted from the right-hand end
UpdateContext({varOddEvenOffset: If(Mod(CountRows(colStringChars), 2) = 1, 1, 0)});

// Rebuild the digits into a new collection, doubling every other one
ClearCollect(colStringSeq, {RecNum: 0});
Clear(colStringSeq);
ForAll(
    colStringChars,
    Collect(
        colStringSeq,
        {
            ValueToSum: If(Mod(CountRows(colStringSeq) + 1 + varOddEvenOffset, 2) = 1, Result, Result * 2)
        }
    )
);

// Concatenate the results into a single string, so '18' contributes '1' and '8'
UpdateContext({varStage1Total: Concat(colStringSeq, ValueToSum)});

// Split that string back into individual digits
Clear(colStage2Numbers);
Collect(colStage2Numbers, Split(varStage1Total, ""));

// Sum the digits
UpdateContext({varStage2Sum: Sum(colStage2Numbers, Result)});

// The check digit is whatever takes the sum up to the next multiple of ten
UpdateContext({varFinalCheckDigit: If(Mod(varStage2Sum, 10) = 0, 0, 10 - Mod(varStage2Sum, 10))});

// Append the check digit to the original input
UpdateContext({varFinalOutput: txtInputValue.Text & varFinalCheckDigit});
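With the test value of 139 from the example above, varFinalOutput ends up as 1396.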

I’m sure the code could be optimised into fewer lines, but I wanted to keep it readable (for when I come back to it in 6 months’ time to see how it works 😄).

That’s it – I’d appreciate any comments about the method I used, or the actual code developed.

I’ll be blogging every few weeks in the future about any more trinkets of PowerApps information I come across. Thanks!

Header Photo by Markus Spiske on Unsplash

Nuneaton Motorbike Ride

A GoPro recording of a ride around the North of Nuneaton area on 20/05/2020.

I’ve switched video editing software from Cyberlink PowerDirector to Adobe Premiere – there was a bit of a learning curve but it didn’t take long to get the hang of it, and it has a load more features than PowerDirector.

Route

 

Hosting an SSRS Report in a Dynamics CRM 365 Form

Introduction

So, as we all know, reports can be designed using Visual Studio or Report Builder and uploaded to the Reports section in CRM.  But what if you want to display a report directly in a form?   This is what a client wanted to do, as it would enable their users to view non-CRM data in a CRM form.  In the past they had issues with browser security and had not gotten very far.

 

Testing the Report

To test this I utilized an Azure VM which had SQL Server 2014, including SSRS (SQL Server Reporting Services), installed on it.  A SQL table and a sample report were created, which looked as below:

Next, a new CRM form was created and an IFRAME added to it with the URL (http://crm2015dev/Reports/Pages/Report.aspx?ItemPath=%2fTestSSRSReport%2fTestSSRSReport) of the above report.  The number of rows for the IFRAME was set to 12, up from the default of 6.

On opening the CRM form, two security prompts were displayed and required accepting before the report would be shown.

 

Removing the IE Security Prompts

A bit of searching online and in forums revealed the two Internet Explorer settings that needed to be changed to remove the prompts.  Both settings are in the same dialog.  To get to it, open IE’s Internet Options dialog, then the Security tab.
Select ‘Local intranet’, then click the ‘Custom level…’ button


Locate the option ‘Display mixed content’ and change it to ‘Enable’.

Locate the option ‘Websites in less privileged web content zone can navigate into this zone’ and change it to ‘Enable’.


After restarting IE, the CRM form now displays the SSRS report with no security prompts:


 

Conclusion

Once you know the settings to change, it’s pretty straightforward to remove the security prompts.  I should say that I’m no expert in IE and I’m not sure of the security implications of making these changes, so if you’re in a corporate environment check these settings with your Desktop Team.

NOTE: These are the version numbers of the applications:
CRM: Microsoft Dynamics 365 Version 1612 (8.2.2.111) (DB 8.2.2.111) online
SSRS: SQL Server 2014 SP1 (Version 12.0.4100.1)
Internet Explorer (IE): Version 11.0.9600.18817

 

Photo by William Iven

Using Power BI with Dynamics CRM 365

Background

Recently I was asked by a client to look into the possibility of using Power BI Dashboards/Reports and displaying them inside their CRM Online instance.  One of the priorities for the investigation was to ensure that the dashboards/reports could be filtered for the current user.  For my test I want to be able to build a dashboard that shows the CRM activities for the logged in user.

 

Initial Investigation

Power BI licensing looks very reasonable.  There’s no cost to author reports and publish online, and it’s $9.99 (approx £7.50) for the Pro version, which gives the ability to share reports and schedule data refreshes.  Also, a quite lengthy 60-day trial is available, which is plenty of time to see if it’s right for you.  After a bit of reading it seems that Power BI can be set to pull in data from a number of data sources, store it internally, and then build very nice-looking charts from the data.  The data can be set to refresh at regular intervals, though the minimum refresh period is 1 hour, and scheduled data refresh requires a Pro license.

 

Data preparation in CRM

To enable me to test the reports/dashboards I’ll be using the CRM Activity entity.  There are a few dozen records in my test CRM instance, and I am the owner of one of the activity records.

 

Power BI Desktop

Using my MSDN account I created an Azure Virtual Machine running Windows 2012 R2 and installed Power BI Desktop.  It’s free to install and use (the installer is 143MB).

Adding Data to Power BI

Clicking on the Get Data button brings up a dialog with a list of the many available data source types.   Entering 365 into the search box filters the list, allowing the selection of ‘Dynamics 365 (online)‘.


The next dialog prompts for the Web API URL for the CRM instance:

The Web API URL for your instance of CRM can be found by navigating to Settings\Customizations, clicking on Developer Resources, and copying the value out of Service Root URL (select the value with your mouse and press Ctrl+C to copy it).  It will look something like https://yourorg.api.crm.dynamics.com/api/data/v8.2/, with yourorg being your instance name.


Next is a prompt for my CRM login credentials.  Having entered the credentials, a list of the entities in the CRM instance is displayed.  I selected systemusers and activitypointers – filtering the list using the search box makes it easier to find an entity:


Filtering Data by User

The data can be filtered to only show data for the current user by using Row Level Security (RLS).  RLS is basically a way to filter the data by criteria that are configured per role.  I will simply be adding a role to filter the systemusers table to the logged-in user; as the activitypointers table is joined to it, it will be filtered also.
The first step is to make a one-to-many join between the two tables.   This is achieved by clicking on the Relationships tab to go to the mapping screen.


From here the two tables are visible, and can be resized to view more of the columns.  To create the link, drag the systemusers ownerid column onto the activitypointers _ownerid_value column:


There should now be a line linking the two tables, with a 1 on the systemusers side and a star on the activitypointers side.  Double-clicking the join allows the properties to be viewed – it should be:


The next step is to add the filter.  Click back to the Home tab and then the Manage Roles button:


This is the filter that should be added:

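The filter is a DAX expression on the systemusers table.  As a rough sketch – assuming the match is on the user’s sign-in email held in the internalemailaddress column – it would be along the lines of:

[internalemailaddress] = USERPRINCIPALNAME()

USERPRINCIPALNAME() returns the signed-in user’s email address when the report runs in the Power BI service.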

Displaying Data

We can now add the data to the main view area.  For our test a simple grid of data will be sufficient.  On the list of Visualizations click the Table icon:


This inserts a table onto the main view area, and columns from the fields list on the far right can now be dragged onto it.  Formatting the table is done by selecting it, clicking the ‘Paint’ icon, and changing the values.


This is how mine looks:


 

Testing

It’s very important to test what we’ve done, as we don’t want users seeing other users’ data.  Click on the ‘View as Roles’ button:


Then enter the following selections, using your email:


The data in the table should now be filtered to display just your data:


Publishing

The next step is to save and publish the dashboard to Power BI Online.  This is done by clicking File\Save (mine is called CRMUserActivities), then File\Publish, and then Publish to Power BI.
Next, go to the Power BI application website.  This step is also critical, as we’re going to configure which users the filtering applies to.  Expand ‘My Workspace‘ and under Datasets there will be your dataset.  Click on it, then click the ellipsis, and then the menu option Security:


Now all the users should be added to the Row-Level Security.  This can be a bit tedious, but I don’t think there is any way around it:


 

Convert Report to a Dashboard

To convert the report to a dashboard that is viewable in CRM, the first action is to open the report and click on ‘Pin Live Page’:


A dialog is displayed where ‘New dashboard’ can be selected and the name of the dashboard entered:


By default Power BI displays report page numbers, which can make the dashboard look a bit untidy.  Removing the page numbers is not very intuitive, and it took me a while to figure out how.  Firstly, click on the new dashboard that has been created under the My Workspace tab:


The next step is a bit tricky.  Move your mouse over the dashboard and an ellipsis should appear to the top right.  Click on this ellipsis:


This brings up another menu, click on the ‘Pencil’ icon:


Finally, we are on the correct window.  Untick ‘Display title and subtitle‘ and then click Apply.


Note: There is also an option on this dialog to ‘Display last refresh time’.  This is the date/time the dashboard was refreshed by Power BI – an item you may want to show to your users.

Add the Dashboard to CRM

We’re nearly there…   To enable Power BI dashboards in CRM there’s a setting under Settings\Administration and then System Settings.  In the Reporting tab there’s an option to enable Power BI dashboards – set this to Yes:


In CRM, go to the Dashboards screen, and click on the arrow to the right of the New button, and select Power BI Dashboard:


And finally, you can select your Power BI dashboard:


I also added a pie chart and a total count to my dashboard.
An added bonus is that Power BI dashboards are interactive: you can select records in the table, or items in the charts, and the other tables/charts will update accordingly.   For example:

Select records in the table


Selected items on the chart


 

Testing Dashboard if you’re the Creator

On creating my first dashboard using this method it didn’t appear to work, and after some time I realised that the creator of the dashboard sees all of the data.  This can be a bit confusing – I’ve read it’s because the creator of the dashboard is the owner of the report and the dataset.
There is a method of testing the dashboard if you’re the creator/owner.  Go back to the Power BI application website and open the dataset Security settings:


Then select the RLS item, click on the ellipsis, and click on ‘Test as role’


You can now test the Dashboard using your data:


What Power BI dashboards should not be used for

Power BI dashboards can be made to use live data via streaming, but the method described above uses data that is updated on a schedule.  This means they shouldn’t replace normal CRM dashboards where the data needs to be live, e.g. outstanding customers to call.

Conclusion

I hope this gave you a good introduction to Power BI and CRM, and what they are capable of in terms of creating meaningful, interactive dashboards.

 

Photo by Carlos Muza

Why I Don’t Use Online Backups

In the days when 100MB drives were the norm and Microsoft Office came on 43 floppy disks, while formatting a floppy disk I somehow managed to format my work hard drive.  Computers had a single drive then, so I lost all of my data and the operating system too.

Since then I keep copies of all my important data on USB drives, which I keep at work and bring home monthly to update; the process is described here.  Ideally I’d have two sets of drives so that the data and backups are never in the same location – I’ll get to that someday.  So for now the backup consists of two drives holding roughly 4TB in total.

4TB might sound like a lot, but I have a lot of photos, music and videos to back up.  Using a Canon DSLR regularly, along with a GoPro-style helmet cam, means the storage of photos and videos grows rapidly.

This is a breakdown of my usage as of today:

After a bit of research online I came up with two options: Backblaze and CrashPlan – both reasonably priced with good reviews.  Unfortunately, CrashPlan was putting an end to its Home service; I’m assuming it wasn’t profitable enough for them.

This left me with Backblaze.  It sounded very good: $5 a month for 1 computer – unlimited backup sizes, unlimited file sizes, and backups of external drives.

I signed up for a 30-day trial and installed the PC client.  The first thing that surprised me was that you can’t easily choose what to back up.  By default it backs up everything, even your OS drive.  The reason they give for this is that most users don’t want to decide what to back up!  I had several very big folders on my drives which I didn’t want backed up, and managed to exclude them (though excluded folder names apply to all drives).

After a few days it had only backed up about 10GB, leaving a long way to go to a complete backup set.  I wondered whether it was because my PC was connected to the ISP router over a wireless connection, but a speed test over both wireless and wired connections using a laptop came back with the same result – my upload speed was 10Mbps.

Checking the Backblaze client indicated that it was using all the bandwidth available to it, backing up around 2GB per hour.  Now, my PC is only switched on when I’m using it.  A few years ago it would be on 24/7 when I was doing 3D rendering, but I don’t do that any more.

So I did a calculation of how long the backup would take to complete using my current bandwidth, if I left the PC on for 5 hours a day (which is more than I normally use it):
5hrs x 2GB/hr = 10GB/day
2,836GB ÷ 10GB/day ≈ 284 days ≈ 40 weeks

So my first backup wouldn’t complete for 40 weeks!  And this isn’t counting the files added during that time.  It’s at this point I decided it wasn’t the right time to move online.  The way internet speeds are increasing I’ll have a much quicker upload in the future, and it may be worth paying extra for more bandwidth.

And that’s why I don’t use online backups.

Photo by Thomas Kvistholt
