EverLog 2020

For years now I have been keeping a daily log of events and information in Evernote. I created an application to build a journal template, essentially an Evernote note with formatted dates and other information for the year. The initial article is here. A follow-up article details Google Calendar integration into EverLog.

Completed 2020 Files

The completed 2020 EverLog format file is available for download here. It can be directly loaded into Evernote with the File - Import menu.

If you don’t use Evernote you can still keep a log with Microsoft Word and this docx file.

Sample output of EverLog 2020 - Three day weekend in the USA

EverLog Application Source

The complete source code for the C# Windows application is here on GitHub.

All Things Open 2019

On October 13-15 I attended the All Things Open conference in Raleigh, NC. The vast array of presentations and workshops drew a crowd of about five thousand to the Raleigh Convention Center. Here is a brief summary of the sessions I attended. There were always three or more topics I wanted to attend in each time slot, so making a choice was often difficult. I spent an hour or so each night reading the session descriptions and listing my targets for the next day.

Blockchain Convergence with John Wolpert, Jerry Cuomo, and Joe Lupin

Forking a blockchain codebase, especially while it is running, is a powerful move that forces community consensus. Ethereum 2 will bring a lot of new capabilities and will be released in three phases, beginning in the first quarter of 2020 with the next two by the end of the year. Lupin described version 2 as a maximally decentralized system. Cuomo described some case studies, such as the Media Ocean project that manages the huge Nike advertising budget. He also emphasized the need for a Certificate of Originality to avoid historical problems such as the Linux / SCO lawsuits in early open source development.

PASETO - Randall Oktadev

This was a spirited overview of platform-agnostic security tokens (PASETO), which were created to overcome the shortcomings of JSON Web Tokens (JWT). Like JWTs, PASETOs carry cryptographically signed, Base64-encoded JSON data. A PASETO is different in that it can only be used once and is only valid for a few seconds. A major problem with JWTs is their use with extremely long durations, which leaves them open to compromise. PASETOs also have local and public versions, with the local version supporting encryption.
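The token layout itself is easy to inspect. Here is a minimal shell sketch of pulling the version and purpose out of the dot-delimited version.purpose.payload format; the token below is a made-up placeholder, not a real signed token.

```shell
# Split a PASETO-shaped string on dots; the payload here is fake Base64 for illustration.
token="v2.public.cGF5bG9hZA.c2ln"
version=${token%%.*}     # text before the first dot
rest=${token#*.}
purpose=${rest%%.*}      # "local" (encrypted) or "public" (signed)
echo "$version $purpose"
```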

Open Source in Government - Amin Mehr

The federal government now has a mandate for all agencies to have 20% of their code open sourced. A guide to this code can be found at code.gov which is an informational layer over the government’s GitHub repository. The government spends an estimated six billion dollars annually on code development.

Advanced Blockchain Technology - Jim Zhong

This talk concentrated on privacy and scalability. Data isolation is implemented by using separate state trees or by partitioning state. This can also be accomplished with separate blockchains, but the overhead can be high due to the large number of permutations. Generating a fresh address with a single key per transaction hides trading patterns. Trusted compute is implemented using a Zero Knowledge Proof (ZKP) or a Trusted Execution Environment (TEE). A TEE is a hardware-based solution, while a ZKP is implemented in software. ZKP essentially means “Show you know item A without divulging item A.” An example using a Sudoku puzzle was presented, with steps of number permutation and exchange of masked “commitments”. I’m sure ZKP has been the subject of many doctoral theses.

Open Source Mapping - Leila Alderman

Google Maps is great, but its detail in an area is limited by what is called the Starbucks radius: what is economically feasible for a map provider to cover. OpenStreetMap.org offers greater detail, and coverage in many countries that Google and Bing maps do not reach. These open source maps often have more data points, such as bicycle racks and even tree locations in parks. Data is entered into the system in two ways: GPS trip tracking and map tracing. Leila encouraged everyone to help by map tracing, which involves drawing with mouse drags over satellite photos.

Modernizing .NET Applications with Docker - Steven Follis

There are many .NET applications that were written 15 to 20 years ago and are not ready for cloud deployment. Docker offers a convenient way to update them. Steven gave a demo using a simple ASP.NET 2.0 application that was originally published on the pre-GitHub site CodePlex. Start with a representative application in your company, preferably one with an application owner who knows how to install it. Leave the database out of your initial attempt, since it usually brings complex problems. The Docker Web Management Service should be used to help with Active Directory authentication / authorization. Not every container is domain-joined, but you can pass in your Active Directory identity using a gMSA credential spec file.

Telco’s Aren’t the Future, You Are - Coco Tang

Helium.com is developing an alternative for IoT communications to reduce the cost of connecting to the internet through telco connections. Their system uses internet-connected routers operating in the unlicensed 900 MHz Industrial, Scientific, and Medical (ISM) band. Hotspots that carry network traffic earn a new token, also called Helium. The network is designed for small amounts of data, with a packet size of 24 bytes. Hotspots are being sold for $495, or you can build one using a Raspberry Pi and a LoRa radio.

Advanced Git - Brent Laster

This session went well beyond setting up a repository and making commits. Rebase, stash, reset, revert, and rerere were explained in detail. Brent fills his 45-minute session with a constant barrage of information, and you leave with a good idea of the workings of these commands, many of which I had never used.
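As a refresher on some of the commands covered, here is a sketch against a throwaway repository showing what stash, revert, and reset actually do; the file names and messages are invented for the demo.

```shell
# Build a scratch repo, then exercise stash, revert, and reset on it.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name Demo

echo one > file.txt
git add file.txt
git commit -qm "first"

echo two >> file.txt
git stash                     # shelve the uncommitted edit
git stash pop                 # bring it back
git add file.txt
git commit -qm "second"

git revert -n HEAD            # stage the inverse of "second" without committing
git commit -qm "revert second"
git reset --hard -q HEAD~1    # drop the revert commit entirely
git log --oneline
```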

Web Performance and the Mini Profiler - Nick Craver

Nick of Stack Overflow gave an overview of building a well-performing web site. He noted that simplicity is as desirable a feature as performance, and stressed the importance of measuring performance. The Mini Profiler is a simple but effective profiler for .NET, Ruby, Go, and Node.js. It is open source and updated frequently.

Adding Google Calendar to EverLog

With the advent of 2019 I wanted to create a new daily calendar log with my EverLog application. My daily log is a place where I record notes on programming, applications, machines, and other topics. Updates to applications, error reports, meeting notes, code snippets, and links to articles I would like to read later are all kept in my log.

I record events by the date with one Evernote notebook per year. I thought it would be a good exercise to import my Google Calendar events into my daily log.

UI Changes

Text boxes for a Gmail address and for a holiday calendar were added. I have defaulted the holiday box to the English (USA) calendar. You can skip adding either of these by clearing its “Include” checkbox.

Birthday Entry

Digging in to the Google Calendar API

Google provides a nice .Net client library. In Visual Studio you can search for a new package to find it or use the Install Command:

PM>  Install-Package Google.Apis.Calendar.v3

At the Google developer site I started with the .NET quickstart to enable the Google Calendar API. This involves creating a JSON credential file. The quickstart instructs you to save the file in the project, but it can be anywhere.

I found the quickstart app lacking, but I found a very nice collection of sample code by Linda Lawton. The part of the repository I used is here on GitHub. The only change I made was to remove the duplicate SampleHelpers from all but one of the files I used.

Authenticating with OAuth

In FormMain.cs the file path to the client secret JSON is set in the variable ClientSecretPath. We call YearList.GetService with a scope of CalendarService.Scope.CalendarEventsReadonly. If we used CalendarService.Scope.Calendar we would be requesting read/write/delete access; it is always best to request only the permissions you need. We then call the API function OAuth2.GetCalendarService, which shows this dialog with your email and application name:

Google OAuth Dialog

Google.Apis.Calendar.v3.Data.Events

A list of Event items is returned by two API calls to CalEvents.list. One gets the personal calendar and the other gets one of the Google standard holiday calendars. I needed to put this information into an array of new objects, with the data reformatted so I could insert it into my existing calendar routine. First I had to understand the API Event object. The key items in the API Event are of course the Start and End EventDateTime objects, such as this birthday on my calendar:

Birthday Entry

Start.Date has the year 2017, which is the year I entered this into Google Calendar. That doesn’t really matter, because all we are interested in is the June 14. This is a recurring event that occurs every year, so I just take the June 14 and combine it with the target year to create a DateTime object in my new CalendarLib object, with a time of 12am so it will appear first when sorted. Also note here that Start.DateTime is null; you don’t need a time component for an all-day yearly date. Next look at the calendar entry for a symphony concert:

Symphony Concert Entry

Here the concert starts at 8pm, so there is a Start.DateTime value with the date and 8pm. All I have to do is copy this DateTime into my new CalendarLib object. I also copy End.DateTime into my new object; it has the same date and 10pm.
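The re-anchoring trick for the all-day case, taking just the month and day and attaching them to the target year with a midnight time so the entry sorts first, can be illustrated outside C# with GNU date (the output format loosely mirrors my DayFormat):

```shell
# Re-anchor a recurring event's month/day (06-14) to the target year at midnight.
export LC_ALL=C            # stable English day abbreviations
target_year=2020
month_day=06-14            # from the Google Calendar Start.Date; its year is ignored
entry=$(date -d "${target_year}-${month_day} 00:00" +"%Y.%m.%d %a %H:%M")
echo "$entry"
```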

The Summary field is the name of the event; I copy that along with the HtmlLink, which allows me to create a link to the calendar entry on the EverLog note. I also copy a number of other fields that are currently unused. I then create the calendar text starting with the first day of the year. For each day I search my collection of CalendarLib objects for a matching date and add any that are found to that day’s entry. Here is a sample of a day with a holiday, a birthday, and a party event all on the same day.

Christmas Eve 2019 in EverLog

When you click on the Xmas Eve Party link a Google Calendar page is opened with a popup dialog of the event:

Christmas Eve in Google Calendar

If you would like just the EverLog end product, you can find the Evernote export format file EverLog2019.enex along with the updated source code in the GitHub repository. This file has just the dates and US holidays.

Installing Azure CLI Using CURL

Overview

In preparation for some upcoming work using the Azure platform I took the Pluralsight course Developing with Node.js on Microsoft Azure - Getting Started by Scott Allen. In the course the Azure portal is used for many of the configuration and administration tasks, but Scott also demonstrates the use of the Azure Command Line Interface. I installed the Azure CLI but had a lot of trouble along the way.

The AZ not found problem

The first step is to install Bash - Ubuntu on Windows, which is fairly easy to accomplish. Next I used the instructions at Install Azure CLI 2.0 for the CLI install. I ran all the scripts in the ‘Windows -> Bash on Ubuntu’ section:

echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ wheezy main" | \
sudo tee /etc/apt/sources.list.d/azure-cli.list

sudo apt-key adv --keyserver packages.microsoft.com --recv-keys 417A0893
sudo apt-get install apt-transport-https
sudo apt-get update && sudo apt-get install azure-cli

There is a warning to run the last command twice, which I did. But when I tried to run “az --version” I got a message that the az command was unknown. I did a lot of searches and tried a lot of fixes from Stack Overflow and other sources, to no avail.
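For what it’s worth, a couple of quick checks would have shown whether the package ever landed on the PATH; these are generic diagnostics, not steps from the install guide.

```shell
# Is az resolvable at all? (Prints a fallback message instead of failing.)
command -v az || echo "az not found on PATH"
# Did apt actually install the package? (Harmless if dpkg is absent.)
dpkg -l azure-cli 2>/dev/null | tail -n 1 || true
# And what does the PATH contain?
echo "$PATH"
```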

The page Failure during installing the azure-cli 2.0 in Bash on Ubuntu on Windows #1075 instructs you to run the following commands before installing the CLI on Bash for Windows.

sudo apt-get update
sudo apt-get install -y libssl-dev libffi-dev
sudo apt-get install -y python-dev

Not being sure about the last command, I also ran:

sudo apt-get install -y python3-dev

But trying the original install scripts again did not work.

Piping CURL into Bash - A victim-less crime

Then somewhere, and I can’t pin down where, I saw this command for the install:

curl -L https://aka.ms/InstallAzureCli | bash

There is an entry under ‘Errors with curl redirection’ on the page Install Troubleshooting that gives a workaround if the command does not work. Running the curl command did the trick. It was obvious that this was different because there were now user prompts not present in the original install I tried:

===> In what directory would you like to place the install? (leave blank to use '/home/fredwebs/lib/azure-cli'):

===> In what directory would you like to place the 'az' executable? (leave blank to use '/home/fredwebs/bin'):

===> Modify profile to update your $PATH and enable shell/tab completion now? (Y/n): y

===> Enter a path to an rc file to update (leave blank to use '/home/fredwebs/.bashrc'):

In trying to find the source of this command I pulled up a few pages of strong opinions about piping curl into Bash. One of these had the title about the victim-less crime. As Ford Prefect told Arthur Dent once, “Don’t knock it, it worked.”

TeamCity Setup for Web API Application

Overview

I was recently tasked with setting up a continuous integration build on a development server for a Visual Studio 2015 Web API solution. I had a little experience using Jenkins but had never attempted to set up such an application. But it was just a single application deploying to the same server; how hard could that be? I quickly found out. In all of my web searches I found maybe a couple of end-to-end articles, but they proved to be incomplete and applied to older versions of TeamCity. The TeamCity documentation is very thorough, but I still struggled with my limited DevOps experience. I can count only a few times I had ever even used MsBuild in my decades of Microsoft experience.

JetBrains describes TeamCity as “Powerful Continuous Integration out of the box”. The latest version is 10.0, and the hefty 900 MB plus Windows installer may be downloaded here. The Professional version is free and offers the use of up to three build agents and twenty build configurations, with runner types such as MsBuild, script, and NuGet.

Installation

The installation is pretty straightforward and there were no problems. Be aware that the installer might put the current Windows user account in a Windows reporting user group. I could not determine whether TeamCity was responsible for this, and since we received a security alert about it we removed the account from the group. TeamCity seems to operate just fine without this group membership.

You will need to create a database for the system to use. We created an empty “TeamCity” database in SQL Server. You will be prompted to enter the connection information along with a SQL account and password. For SQL Server the JDBC drivers must be installed on the server. The JDBC driver package gives you two versions. You will have to tell TeamCity which one to use. I found the 6.0 version had no problems. MySQL and other databases are supported as well.

Important File Locations

With any large, complex application like this, files are created and stored in various places. Here are some of the important locations for our installation. Note that you specify the TeamCity installation location during the install:

  • TeamCity Installation: E:\TeamCity
  • Application Data: C:\ProgramData\JetBrains\TeamCity
  • Repository Location: E:\TeamCity\buildAgent\work\{generated Id}\
    The source, libraries, and compiled files for the project will be here. A NuGet packages folder will be created and loaded at the level of the solution file if you use a NuGet runner in TeamCity.
  • Windows 10 SDK: installer at https://developer.microsoft.com/en-us/windows/downloads/windows-10-sdk
    This would not install on our server, so I copied the SDK files below from my dev machine to the same folder on the server.
  • SDK Files: for MsBuild version 14 this will be "C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.6 Tools\"
  • JDBC Files: C:\ProgramData\JetBrains\TeamCity\lib\jdbc
    Copy sqljdbc_6.0\enu\jre8\sqljdbc42.jar to this folder.
  • Microsoft Build Tools 2015: installer at https://www.microsoft.com/en-us/download/details.aspx?id=48159
  • Visual Studio Files: C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0
    Copy these files from your dev machine to the same location on the server.
  • NuGet Executable: https://dist.nuget.org/index.html

Setting up the root project

The first step is to set up your root project. On this page I only used the General Settings and VCS Roots menu items.

Under VCS Roots you create a connection to source control. This was very straightforward. You might want to create a new login for use by TeamCity; it appears it only needs read-only access. Note that the Template below is not used in our setup but could be used to simplify the setup of multiple similar builds.

The Root Project

On the General tab click the Create subproject button and give it a Name, a Project Id, and a description as shown in this edit view of my subproject.

After you have created your subproject click on the Create Build Configuration and give it a name. I used “Standard Build” here. When it is created you click on the edit link by the Build Configuration (WebSvcBuild here). You then get a menu for the build steps.

Sub-Project Edit View

Build Configuration

General Settings

On this page enter a build name. Here I used “StandardBuild”. TeamCity creates the Build Configuration ID for you, which can be changed. I used the defaults for the rest of the page.

General Settings

VCS Roots

Here you need to click on the Attach VCS root button to create your VCS Connection.

VCS Roots

Edit VCS Roots

Here is the edit view of the VCS root that is shown when you click the Edit link next to the VCS root name. Once again choose a name then enter the location and credentials for your VCS.

Edit VCS Root

Build Steps

Here is where the build and deploy are defined, and where most of my struggles occurred. You may use up to three build agents with the Professional version.

There is a NuGet runner you can use to retrieve your packages. The files will be placed in the packages folder at the same level of the solution. I had problems getting all my packages since we have an aggressive firewall which caused timeouts on various packages. I reverted to copying my packages from my development machine. This was just as well since I could find no way to direct the packages to our standard lib folder that is at a level above the solution. Note that you will have to copy NuGet.exe to the server and tell TeamCity where it is.

For a .NET application there is a Solution runner type. I began using the Solution runner to compile my project and added a PowerShell script runner to copy the compiled files to the target IIS folder. This seemed to work fine until I got a 404 for every call made to the deployed site.

Another investigation found that two key files were missing:

  • App_global.asax.compiled
  • App_global.asax.dll

These files are created when you build with a Publish profile. I could not make the Solution runner create the Publish profile so I reverted to using the MsBuild runner. I developed the build parameters and publish profiles on my dev machine before trying them in TeamCity. The publish profile is stored at

  • {solution level}\{WebApiProject}\Properties\PublishProfiles\MyServicesPublish.pubxml.

These are the MsBuild parameters that worked on my dev machine:

  • /p:DeployOnBuild=true
  • /p:PublishProfile=MyServicesPublish
  • /p:AspnetMergePath="C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.6 Tools\"

However, when I used these parameters in TeamCity I got this error: “Can’t find the valid AspnetMergePath”. Thus began another two-hour investigation. The merge path specified was there. After trying many things and making numerous web searches I still got the same error message.

By chance I noticed the error message had a single quote and a double quote at the end, in what I thought was the wrong order. I changed the double quotes to single quotes on the AspnetMergePath parameter; same error. I guess it was 1995 when Windows first allowed you the luxury of spaces in file and folder names. In desperation I copied the contents to a new folder:

  • {TeamCity Install}\bin\NETFX4.6Tools.

Almost any folder will do; just get rid of the spaces in the last folder. Without the spaces in the path I could remove the quotes from the AspnetMergePath parameter. After this my project ran without errors.
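Putting it together, the working invocation looks roughly like this (the solution name is a stand-in; note the space-free merge path no longer needs any quoting):

```shell
# Assemble the MsBuild command line from the parameters above.
merge_path='E:\TeamCity\bin\NETFX4.6Tools'
cmd="msbuild MyWebApi.sln /t:Rebuild;Publish /p:DeployOnBuild=true /p:PublishProfile=MyServicesPublish /p:AspnetMergePath=$merge_path"
echo "$cmd"
```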

Build Steps

MsBuild Step

Here is my completed MsBuild page. Note that the Build file path is always relative to your root source control folder. The Targets field has Rebuild and Publish to ensure the App_Global files are created. My publish profile file has the path to my IIS folder. This eliminates a script runner copying files from the VCS location and saves you from using one of your three runners.

MsBuild Step

Triggers

You add a trigger for the build here with the Add new trigger button.

Triggers

Edit VCS Trigger

This is the edit view of the trigger configuration which builds the system whenever there is a check-in.

Edit VCS Trigger

Agent Requirements

I did not have to change anything on Agent Requirements but wanted to point out that you can check here to see that MS Build Tools are installed correctly. Here its condition is “exists”.

Agent Requirements

Keeping a Daily Log in Evernote

For the last few years I have been keeping a daily log of events and information in Evernote. In this article I will describe how I use a daily log in Evernote and the code I used to build a journal template.

Using a Daily Log

My daily log is a place where I record notes on programming, applications, machines, and other topics. Updates to applications, error reports, meeting notes, code snippets, and links to articles I would like to read later are all kept in my log.

I record events by the date which gives some order to this jumble of information. Also when tracking errors down it is good to have a date associated with their appearance.
Despite the dated entries I don’t really use it as a calendar or planner, though on occasion I will put an entry on a future date for some deadline or task. Mostly it is a record of what happened when, on what project, program, or machine.

I keep one log per year. Although you can search over all your notes in Evernote, I find it useful and faster to search within a note if I know the date or event was in a particular year. Also, when you first bring up a note you are at January 1, so you have to page down to the current date to start a new entry for the current day.

Advantages of Evernote

Evernote is an online service for documents, images, web pages, and voice notes. Its greatest advantage is that your information is available via several means. The Evernote website has a sophisticated web application to view and edit your notes. There are applications available for Windows, Mac, iOS, and Android. The basic service is free and the paid service adds things like performing OCR on images and adding the words to the search index. There are also Chrome and Firefox add-ins that allow you to save a partial or entire page.

The majority of my Evernote usage is through the Windows application. The data is stored locally and synced with the website at an adjustable interval. Notes are edited in a rich text editor that supports different fonts and colors, indents, bullet lists, and tables. Formatted data pasted in will retain the format of the original document.

Keeping the Log

When I started the log I kept a weekly template with horizontal rules to divide the daily entries and put a partial day/date at the top of each section. That meant every Monday morning I would have to copy this template, paste it into the end of the log, and then change seven strings like “2016.mm.dd Monday” to “2016.12.26 Monday”.

I got pretty tired of doing this, and last year I started to investigate how to create a template that would suit my needs. Unfortunately paid programming work got in the way and I had to put the project aside. I also found a 2016 calendar template on the Evernote site and managed to edit it into a usable log template, but it had several drawbacks. The major one was that it was one big table, which limited how you could format things since each day lived in a single table cell. Also, when you did an undo the document would zip back to the top, which was very frustrating after about April or May!

Creating a Template

This year I was determined to revive my custom template project and have a new template in place on January 1.

My original idea was to have a program create the date text and formatting data and save it into the Windows clipboard for easy pasting into Evernote. There is an Evernote XML export / import format (.enex), but I decided not to dig into that. Besides, how hard could it be to write formatted data to the clipboard? The application source is available here on GitHub.

The Evernote Log Application

Reading the Clipboard

First I created the function GetClipboardInfo to read the current clipboard entry and list the formats it contains.

public static StringBuilder GetClipboardInfo()
{
    var sb = new StringBuilder();
    var cbDataObj = Clipboard.GetDataObject();

    if (cbDataObj == null)
    {
        return sb;
    }
    var fmts = cbDataObj.GetFormats();
    sb.AppendLine("Data object formats: ");

    foreach (var t in fmts)
    {
        sb.Append("\"");
        sb.Append(t);
        sb.AppendLine("\"");
    }

After listing the formats I check whether an HTML Format object exists. If it does, we read it with the GetData function. The Replace call changes the newline characters to carriage return / newline pairs so the WinForms text control will display the line breaks.

    // HTML Object
    if (!cbDataObj.GetDataPresent("HTML Format"))
    {
        sb.AppendLine("No Html object");
    }
    else
    {
        var doc = cbDataObj.GetData("HTML Format");
        sb.AppendLine(doc.ToString().Replace("\n", "\r\n"));
    }

The “Version:0.9” line is the beginning of the HTML object, followed by markers for the start / end positions of the actual HTML and of the Fragment, which is what was copied. By copying parts of an Evernote note and using this tool I was able to determine what to put in the generated template.
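For reference, the “HTML Format” clipboard payload has a fixed header shape; this sketch reproduces it with illustrative (not computed) byte offsets:

```shell
# CF_HTML layout: a Version line, four byte-offset markers, then the HTML itself,
# with the copied region bracketed by StartFragment / EndFragment comments.
header=$(cat <<'EOF'
Version:0.9
StartHTML:0000000105
EndHTML:0000000216
StartFragment:0000000141
EndFragment:0000000180
<html><body>
<!--StartFragment--><b>2017.01.01 Sun</b><!--EndFragment-->
</body></html>
EOF
)
echo "$header"
```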

The class GenEverLog has a single function, CreateYear, with a single parameter for the year to create. The HTML strings are divided between the sections before and after the date strings to be printed. The format strings are used for date formatting. These could be made variable with control settings added to the form, and the hard-coded colors in the HTML strings could likewise be made selectable.

private const string HtmlStart = "<span><hr/><div><span style=\"color: rgb(54, 101, 238);\"><b>";
private const string HtmlEnd = "</b></span></div><div><br/></div><div><br/></div><div><br/></div></span>";

private const string HtmlWeekStart = "<span><hr/><div><span style=\"color: rgb(123, 0, 61);\">=== <b>Week ";
private const string HtmlWeekEnd = "</b> ===</span></div><div><br/></div></span>";

private const string HtmlStartMonth = "<span><div><span><br/></span></div><hr/><div style=\"text-align: center\"><span><b><span style=\"color: rgb(50, 135, 18);\">";
private const string HtmlEndMonth = "</span></b></span></div></span>";

private const string MonthFormat = "MMM yyyy";
private const string DayFormat = "yyyy.MM.dd ddd";

From this point it is a simple matter to create a DateTime object, increment the day in a while loop, and build up a StringBuilder object with the HTML and the dates, months, and weeks. Note that a companion text-only string is also built up and sent to the clipboard. This text-only string will be retrieved by applications that do not support HTML, such as Notepad.

public static Tuple<StringBuilder, StringBuilder> CreateYear(int calendarYear)
{
    var dateIncr = new DateTime(calendarYear - 1, 12, 31);

    var year = calendarYear;
    var lastMonth = 0;
    var week = 0;

    var sbHtml = new StringBuilder();
    var sbText = new StringBuilder();

    while (year == calendarYear)
    {
        // Increment date
        dateIncr = dateIncr.AddDays(1);
        year = dateIncr.Year;

        if (year != calendarYear)
        {
            break;
        }
        var month = dateIncr.Month;

        if (month != lastMonth)
        {
            lastMonth = month;
            // Do month header
            sbHtml.AppendLine($"{HtmlStartMonth}{dateIncr.ToString(MonthFormat)}{HtmlEndMonth}");
            sbText.AppendLine($"{dateIncr.ToString(MonthFormat)}");
        }
        // Week entry
        if (dateIncr.DayOfWeek.Equals(DayOfWeek.Monday))
        {
            sbHtml.AppendLine($"{HtmlWeekStart}{++week}{HtmlWeekEnd}");
        }
        // Do date entry
        sbHtml.AppendLine($"{HtmlStart}{dateIncr.ToString(DayFormat)} - ({dateIncr.DayOfYear.ToString("D3")}){HtmlEnd}");
        sbText.AppendLine($"{dateIncr.ToString(DayFormat)} - ({dateIncr.DayOfYear.ToString("D3")})");
    }
    sbHtml.AppendLine($"{HtmlStartMonth}End of {calendarYear}{HtmlEndMonth}");
    sbText.AppendLine($"End of {calendarYear}");

    ClipboardHelper.CopyToClipboard(sbHtml.ToString(), sbText.ToString());

    return new Tuple<StringBuilder, StringBuilder>(sbHtml, sbText);
}

To write the HTML to a new clipboard entry, I used the ClipboardHelper code from this article by Arthur Teplitzki. I had to clean up the code copied from the web page to change Word-style double quotes to regular quotes, and while I was at it I changed the style to use newer C# conventions. The only functional change I made was to remove the insertion of the Doctype line.

The “Write Everlog to Clipboard Function” button will call the CreateYear function and display “Added HTML to Clipboard”. If you want to see the HTML created, simply click the “Read Clipboard” button again.

Viewing the created clipboard entry

If you are only interested in the template, the Fredwebs2017.enex file can be imported directly into Evernote. The file is included in the GitHub repository or can be downloaded directly here.

The top of the 2017 log

Bing Maps Using Web API

Bing Maps recently retired its SOAP web service interface. The new interface is a REST service, and JSON data contracts define the response interface. There is a sample program, Parsing REST Services JSON Responses, that I used as a starting point for my code, which is located here.

Bing Maps Key

To use the Bing Maps interface you will need a key. See Getting a Bing Maps key if you don’t have one already. In my code I read the key from the environment variable named “BingMapsKey”:

_bingMapsKey = Environment.GetEnvironmentVariable("BingMapsKey");

Building and sending the request URI

The base of the URI for all requests is:

private const string BingRestLocation = "http://dev.virtualearth.net/REST/v1/";

With this base we add “Locations”, then the search string, and add the key at the end. Be sure to use WebUtility.UrlEncode on your location search.

var urlRequest = $"{BingRestLocation}Locations/{place}?key={_bingMapsKey}";
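The same request URI can be assembled by hand; note that the space in “New York” must be percent-encoded, and the key below is a placeholder:

```shell
# Build the Bing Maps Locations request URI from its parts.
base="http://dev.virtualearth.net/REST/v1/"
place="New%20York"     # "New York" after URL encoding
key="YOUR_KEY"         # substitute your real Bing Maps key
url="${base}Locations/${place}?key=${key}"
echo "$url"
```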

Next we make a call with Web API to the service using the MakeRequestWebApi function with “New York” in the place string. The Tuple that is returned has a Bing Response object and a status string.

public static Tuple<Response, string> MakeRequestWebApi(string requestUrl)
{
    var httpResponseMessage = Client.GetAsync(requestUrl).Result;

    if (!httpResponseMessage.IsSuccessStatusCode)
    {
        return new Tuple<Response, string>(null, $"Response Status: {httpResponseMessage.StatusCode}");
    }

    var jsonString = httpResponseMessage.Content.ReadAsStringAsync().Result;

    using (var ms = new MemoryStream(Encoding.Unicode.GetBytes(jsonString)))
    {
        var deserializer = new DataContractJsonSerializer(typeof(Response));
        return new Tuple<Response, string>((Response)deserializer.ReadObject(ms), "success");
    }
}

Parsing the return

Once we get a Response object back, we have to cast to the object type that we requested, in this case a Location. The other resource types are Route, TrafficIncident, CompressedPointList, ElevationData, and SeaLevelData.

The function ProcessLocationResponse is from the original program and shows the locations found with high confidence and their geocode points.
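In outline, that processing walks the returned resources and keeps the high-confidence matches. The sketch below uses stub classes standing in for the sample's data-contract types (the property names follow the Bing REST response shape, but these stubs are mine, not the sample's code):

```csharp
using System.Collections.Generic;
using System.Linq;

// Stub types standing in for the sample's DataContract classes.
public class Point
{
    public double[] Coordinates { get; set; }
}

public class Location
{
    public string Name { get; set; }
    public string Confidence { get; set; }
    public Point Point { get; set; }
}

public static class LocationFilter
{
    // Keep only the high-confidence matches, as ProcessLocationResponse does,
    // so the caller can display their names and geocode points.
    public static List<Location> HighConfidence(IEnumerable<Location> locations)
    {
        return locations.Where(l => l.Confidence == "High").ToList();
    }
}
```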

Retrieving a Route

I also needed to get route instructions from the Bing Maps interface. I added a Location search for three more specific addresses to make a cross-country musical journey from the Brill Building in New York to the Whisky a Go Go in LA, with a stop at Stax Studios in Memphis.

const string brillBuildingAddress = "1619 Broadway New York NY 10019";
const string staxStudiosAddress = "926 E McLemore Ave, Memphis, TN 38126";
const string whiskyaGoGoAddress = "8901 W. Sunset Blvd West Hollywood, CA 90069";

A Routes request requires at least two waypoints. The MakeWaypointString function takes a List of Locations and builds the waypoint string.

public static string MakeWaypointString(List<Location> waypoints)
{
    var waypointsSb = new StringBuilder();
    var waypointCntr = 1;

    foreach (var waypoint in waypoints)
    {
        waypointsSb.Append($"wp.{waypointCntr}=");
        waypointsSb.Append($"{waypoint.Point.Coordinates[0]},");
        waypointsSb.Append($"{waypoint.Point.Coordinates[1]}&");
        waypointCntr++;
    }
    return waypointsSb.ToString();
}

The ProcessRouteResponse parses the route information and shows the instructions and coordinates of each itinerary item in both legs of the trip.
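Putting the pieces together, the Routes request URL can be assembled from the base URI, the waypoint string, and the key. A sketch (MakeRouteUrl is my illustrative helper, not part of the original sample); it works because MakeWaypointString already terminates every wp.N pair with an ampersand:

```csharp
public static class RouteUrlBuilder
{
    // Sketch: MakeWaypointString ends each "wp.N=lat,lon" pair with '&',
    // so the key parameter can be appended directly after the waypoints.
    public static string MakeRouteUrl(string baseUri, string waypointString, string key)
    {
        return $"{baseUri}Routes?{waypointString}key={key}";
    }
}
```

The result is passed to MakeRequestWebApi the same way as the Locations request.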

InstallShield Limited Edition

A recent client project brought me back into the world of Windows WPF applications. The task was to update some very old third party controls and to change Bing Map service calls to the Bing Maps REST API from the disappearing SOAP API.

Updating a six-year-old project

Naturally a new install program would be needed. The code I was given was in a Visual Studio 2010 solution with a Visual Studio Installer project (.vdproj). This installer type is no longer supported in Visual Studio 2015. There is a Visual Studio add-in that supports the old format, but I wanted an up-to-date solution. Since I had already spent significant client funds on software and hardware, I decided to try the InstallShield Limited Edition that is licensed free with Visual Studio.

Free software - you get what you pay for

I was able to get an installer build that did work. However, the interface is extremely clunky and you are never quite sure when the installer build is actually running. Files you add by mistake cannot be removed from the build list; you have to uncheck the check box and hope you don’t check it again later. A couple of times the InstallShield process killed Visual Studio and required a visit to the Task Mangler to kill the whole thing. The project had some large data files, which took InstallShield an incredible amount of time to compress.

Due to the half hour required to build the installer, I unloaded the installer project from the Visual Studio solution until I needed to build a new installer. The very first time I attempted to reload the installer project, the InstallShield installer popped up and informed me that it needed to make changes to continue. I reluctantly gave it the OK. It churned away for a few minutes and then requested a reboot. After the reboot I tried again to reload the installer project. This time Visual Studio informed me that this type of project is no longer supported and would not load it.

Changing to a better installer

I purchased DeployMaster for $99. It is from Just Great Software, whose products I have enjoyed using for years. It took a couple of hours to learn the new program, but it did the job superbly. You can easily make changes, it runs outside of Visual Studio, and the installer builds in about five minutes versus the half hour with InstallShield. As a bonus, the resulting installer file uploads to web storage much more quickly, since it is only 238MB versus 530MB.

Moving a Blog to Hexo

I started blogging using a self-hosted version of a platform called Das Blog. It had two major attractions for me: it was an open source .NET application, and it stored everything in XML files. Just ten years ago hosting was more expensive and you were often allowed only one database, so the XML solution had appeal. It worked well for years, but updating the blog whenever a new version of Das Blog was released took a lot of time and caused headaches.

After a few years of Das Blog operation I really didn’t have much on my blog. I incorrectly assumed that I would write more for the blog if I wasn’t spending time to update it. So I moved to a hosted WordPress solution.

A few more years down the line I took a Pluralsight course titled Build a Better Blog with a Static Site Generator by Jeff Ammons. The course description said you could make your blog load faster with a static site generator without resorting to hand-written HTML. I’m bracket averse, so this sounded good to me. The course is great and inspired me to leave the WordPress world behind.

The course covers two different blogging platforms, Hexo and DocPad. Hexo is specifically designed for blogs and is simpler to implement, so I went with that solution. Hexo is a Node.js application that compiles a few configuration and template files, along with your pages written in Markdown. The output is placed in a single folder which can be copied to any web server, since it is only HTML and JavaScript. In the course Jeff shows how to update your site using Git. WordPress has an export function that creates Markdown files, which I easily updated for the new site.
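The day-to-day workflow boils down to a few Hexo CLI commands. A sketch of a typical session (hexo deploy assumes a deploy target, such as a Git remote, has been configured in _config.yml):

```shell
# One-time setup: install the Hexo CLI and scaffold a new site
npm install -g hexo-cli
hexo init myblog && cd myblog

# Write a post (creates a Markdown file under source/_posts/)
hexo new "My New Post"

# Compile the templates and Markdown into the static public/ folder
hexo generate

# Preview locally, then publish to the configured deploy target
hexo server
hexo deploy
```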

I’ll have to wait and see if I actually blog more with Hexo, but one bonus of the two conversions I have done is that each was a great opportunity to clean out outdated posts.

2015 France Photos

First days in Paris

Tuesday Afternoon

Wednesday on the Left Bank

Friday in Paris

The 4th arrondissement of Paris

Drive to Breuil

Château de Chantilly

Sunday Brocante (flea market) at Neuilly St. Front

La Ferte-Milon

In and near the country house

Sunrise at the Chateau

Reims Cathedral

Reims and the Ruinart Champagne Tour

The canal at La Ferte-Milon

Marolles

Cointicourt and the walk home

The Ruins at Fere-en-Tardenois

The Oise-Aisne American Cemetery and Memorial

The City of Fere-en-Tardenois