Seven years an Apple user, but no more

I switched to Apple in 2013, back when I grew weary of the never-ending and pointless changes in Linux desktop environments; in particular, Ubuntu’s switch to Unity (which didn’t stand the test of time, I see). If nothing else, Ubuntu’s desktop environment back then (GNOME 2.x) prepared me for OS X (now macOS): its file manager was about as useless as the Mac Finder, so I got used to not expecting much in that department, and it was generally in the same “less is more” camp as Apple. Before that I used various KDE-based Linux distributions, but I eventually gravitated to Ubuntu as I became less fussy about things such as useful default file managers.

My Late 2013 15″ MacBook Pro (a.k.a. “Ol’ Bessie”) is starting to struggle, albeit mostly when I run Parallels to fire up a Windows 10 virtual machine in order to work from home. If it weren’t for having to do that, I’d probably be a happy little camper for the most part. But the noisy fan under heavy loads (not just when using Parallels), and hearing about all the problems with Apple computers running noisy and hot, means I just cannot bring myself to pay top dollar for inferior hardware any more.

I briefly considered getting a Mac Mini (like I did originally back in 2013 before returning it for a refund because of video issues) but they’re also cramped little boxes which get too hot and have noisy fans under load. There’s always an iMac but I already have a Dell 27″ monitor which I’m quite happy with, and I’m reluctant to pay extra for a screen that I don’t need. I’m also not paying A$1,050 for a 2 TB SSD, which I’m pretty sure wouldn’t be a Samsung 970 Evo Plus!

The problem with Apple’s hardware is mostly of its own making: an obsession with reducing size at the expense of adequate cooling. Apple is also flogging a dead horse by stubbornly sticking with Intel, whose mobile offerings now struggle to match AMD’s. This guy says it best:

“AMD’s Precision Boost, unlike Intel’s Turbo Boost, does not have a duration limit. So, much like a GPU, Ryzen third-gen will keep boosting for as long as it can within its thermal and power budgets. In other words, these things are designed to redline.”


“…for the first time in over a decade, AMD is essentially on top of the consumer CPU race, and much like in the Athlon 64 days, Intel has no meaningful response on their roadmap…”

A friend of mine at work said that his daughter recently bought a 13″ MacBook Pro and commented on how hot it can get, too. I mostly run my laptop in “clamshell mode” and don’t actually use it like a laptop 99% of the time, but I still don’t want the thing overheating on my desk. I have the option of salary packaging a laptop, too, which will save me around 40% of the cost, but even if I could save a couple of grand, I still wouldn’t want to buy a MacBook Pro today; the hardware is just overpriced garbage.

So, I’m now contemplating building my own box at a fraction of the cost (prices were current as of 25-March-2020):

  • $349 – AMD Ryzen 5 3600 3.6 GHz 6-core processor
  • $259 – Gigabyte B450 AORUS PRO ATX AM4 motherboard
  • $249 – G.Skill Ripjaws V Series 32 GB (2 x 16) DDR4-3200 memory
  • $218 – Crucial P1 1 TB M.2-2280 NVME solid state drive
  • $145 – Gigabyte GeForce GT 1030 2 GB Silent low profile video card
  • $169 – Corsair RM550x Gold 550W power supply
  • $109 – Corsair 110Q Mid-Tower Quiet ATX case
  • $25 – LG GH24NSD1 24x SATA DVD-RW drive
  • $1,523 in total

I’ll sleep on it some more and see if Apple miraculously comes out with a decent mid-level computer that’s not a compact little furnace. I may be waiting a long time for that day to come, sadly.

Besides, it will be easier returning to Windows after all these years, having walked away when Windows Vista came out. Microsoft have lifted their game considerably since then, and I’m quite happy with it at work. There’s even a Windows Subsystem for Linux, Internet Explorer is almost dead and buried, and pretty soon there will be an equivalent to Spotlight, which I couldn’t live without. Things are looking pretty good for Windows users now, and I’m looking forward to coming back to my first operating system.


Developing in React with an API hosted in IIS

The Problem

I’ve recently decided to embrace React for my front-end development, having used Sencha Ext JS for many years before returning to jQuery this past year. The idea of using npm and webpack to build a web site once gave me the heebie-jeebies, but without them you end up with the kind of sprawling, unmanageable mess that I created with jQuery in my haste to move away from Ext JS for a particular project. I still like Ext JS but, following some hefty price increases and the uncertainty that came with a change of ownership, the wisdom of an exit strategy became apparent.

So, now that I’ve chosen React, how do I get the webpack-dev-server running on port 3000 to play nice with my existing .NET web service running in IIS on port 80? I searched but couldn’t find a way to get my React app — with hot module replacement — running within IIS. I once inherited a VB.Net web app (urgh!) that took 30 seconds to rebuild every time I had to test a change in the browser, and I wasn’t going to go through that nightmare again with “react-scripts build” (whose output, of course, does play nicely with my existing API when running from within IIS).

Instead of trying to get the webpack-dev-server (via “react-scripts start”) to talk to a “foreign” API (on a different port), why not try and approach things from the other end?

The Solution

The easiest way to make my C# .NET WCF web service play nicely with React on port 3000 is to:

  • Enable CORS
  • Use identity impersonation
  • Enable anonymous access

This way, I can take advantage of hot module replacement updating my browser instantly whilst still calling my API, with its Windows authentication requirement preserved via impersonation. Here’s what you need to do…

Add the following three lines (marked with numbered comments below) to the web.config file at the root of your web service:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <system.web>
        <!-- 1. Use identity impersonation (configured in a separate file; see below) -->
        <identity configSource="IdentitySecrets.config" />
    </system.web>
    <system.webServer>
        <security>
            <authentication>
                <!-- 2. Enable anonymous access -->
                <anonymousAuthentication enabled="true" />
            </authentication>
        </security>
        <httpProtocol>
            <customHeaders>
                <!-- 3. Enable CORS for the webpack-dev-server origin -->
                <add name="Access-Control-Allow-Origin" value="http://localhost:3000" />
            </customHeaders>
        </httpProtocol>
    </system.webServer>
</configuration>

If you’re happy to keep your password as plain text in the main web.config file, you could just use this line instead of the one above:

<identity impersonate="true" password="whatever" userName="domain\me" />

But because I don’t like that idea, I’m shifting the identity impersonation configuration to a separate file: firstly so that I can encrypt it without cluttering up my main web.config, and secondly so that I can exclude it from version control. In order to encrypt the identity tag, create the following file in a temporary directory and call it web.config (aspnet_regiis.exe expects a properly structured web.config, hence the wrapping tags, which we will remove afterwards):

<configuration>
    <system.web>
        <identity impersonate="true" password="whatever" userName="domain\me" />
    </system.web>
</configuration>

Now open a command-prompt as Administrator and navigate to that directory. Find the appropriate version of .NET for your web service then execute the following command to encrypt the file (the period at the end is important; it denotes the current directory where the web.config file is located):

C:\Windows\Microsoft.NET\Framework\v4.0.30319\aspnet_regiis.exe -pef system.web/identity .

Administrator command-prompt window

If you encounter any problems trying to encrypt the file, see if this article is of any help. You should now have an encrypted version of the web.config file, which will look similar to the picture below:

Web.config file showing encrypted values after running aspnet_regiis.exe

You will need to open the newly-encrypted web.config file in a text editor and remove the <configuration> and <system.web> tags, since the configSource attribute on the <identity> tag expects an <identity> tag to be the root element in the file (but aspnet_regiis.exe won’t encrypt it without the file looking like a proper web.config). After removing those tags, save it as IdentitySecrets.config in the same location as the web.config of your web service.

With identity impersonation and CORS now configured in your web service, you should be good to go. Rebuild the web service and then make a call to http://localhost/myapi from your http://localhost:3000 React app, and it should work nicely.
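To make the dev/production split explicit, here’s a minimal sketch of how the React app might choose its API base URL; the apiBase helper and the /myapi path are just illustrative names, not from my actual project:

```javascript
// Illustrative helper (names are placeholders): during development the React
// app runs on :3000 while the WCF service runs on :80, so the call is
// cross-origin (hence the CORS header in web.config). The production build is
// served by IIS itself, so a same-origin relative path works with no CORS.
function apiBase(hostname, isDev) {
  return isDev ? `http://${hostname}/myapi` : '/myapi';
}

console.log(apiBase('localhost', true));  // http://localhost/myapi (cross-origin)
console.log(apiBase('localhost', false)); // /myapi (same-origin)
```

The production path being same-origin is also why the transformation files later on can safely strip the CORS header for test and production servers.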

Minor Caveat

The only tiny problem with this approach is that you will get errors like this when navigating into the authentication node of the web service in the IIS Admin console:

IIS Admin console error message

When you dismiss the error message, the authentication section looks like this, with “Retrieving status…” in the status column for all authentication types:

IIS Admin console showing the Authentication section

The web service continues to operate just fine, but the admin console can’t handle the encrypted identity impersonation. If you need to make other changes to the authentication settings through the console, just remove the <identity> tag from the web.config, make the changes, then put the <identity> tag back when you’re done.

Finishing Up

One last thing that you should probably do is make some changes to your web.config transformation files (if you have them) so that the identity impersonation, anonymous access, and CORS changes are not deployed to your test and production servers (which will use the built version of your React app):

<?xml version="1.0"?>
<configuration xmlns:xdt="">
    <compilation xdt:Transform="RemoveAttributes(debug)" />
    <identity xdt:Transform="Remove" />
            <anonymousAuthentication xdt:Transform="Remove" />
        <add name="Access-Control-Allow-Origin" xdt:Locator="Match(name)" xdt:Transform="Remove" />

If you’re in a team of developers and they don’t wish to use identity impersonation in their development copies of the web service (and also don’t want to keep commenting out the <identity> tag each time they do a git pull or svn update), they can just create an IdentitySecrets.config file as follows:

<identity impersonate="false" password="whatever" userName="nobody" />

Enabling anonymous authentication (so that React can access the API unhindered) may also mean changing the way you identify users in your web service. Instead of using System.Web.HttpContext.Current.User, I had to switch to System.Security.Principal.WindowsIdentity.GetCurrent(), which seems to identify the current user correctly whether I’m using anonymous access with impersonation or Windows authentication by itself.


Oracle SQL Developer’s Laziness

Come on, Oracle. Is it that hard to scan a directory to look for one of your own products?

Oracle SQL Developer first-run screen asking for path to JDK

The path I gave it was “C:\Program Files\Java\jdk1.8.0_152”.

Was that so hard? If you’d like some help in looping through a directory, give me a call.

This is why Java applications get such a poor reputation with some people. Their developers can be bone idle, not bothering to take care of something as fundamentally basic as finding an installed JDK.

Java could be so much better than this.

MP3 Track Order Nightmares on MZD Connect

I have a 2015 Mazda 3 SP-25 which has been driving me nuts for some time now. Apart from its frequent habit of forgetting which track I was listening to whenever I start the car, it has been playing tracks in the wrong order this past year. I’ve only just now finally figured out why it’s playing tracks out of sequence.

The TL;DR (too long, didn’t read) summary is basically this: macOS somehow writes files to FAT32 volumes in such a way that they won’t be listed in the correct order on other operating systems. This must be a regression in recent macOS versions, because I don’t recall having this problem a few years ago.

Here’s an example of an album that was copied to a FAT32 volume on macOS 10.13 “High Sierra”:

DOS window showing files in the wrong order

Using the command “dir /x”, you can also see the “8.3”-compatible filenames, which appear to be fine, yet the files are still not shown in the correct order.

If you use a Windows 10 computer to copy this folder to the Desktop, delete it from the USB stick, and then copy it back to the same USB stick, the files will be shown in the correct order:

DOS window showing files in the correct order

Here’s what the album copied by a Mac looks like in the Mazda’s MZD Connect entertainment system:

MZD Connect screen showing files in the wrong order

And here’s what the album copied by Windows looks like:

MZD Connect screen showing files in the correct order

I initially suspected the randomly generated, backwards-compatible “8.3” filenames as the culprit, but they seem fine, so exactly why the files are neither shown nor played in the correct order is still a bit of a mystery. The blame mostly lies with Apple for writing files to FAT32 volumes in such a way that they are not read in the correct order. It would be nice, though, if Mazda could sort the file names alphabetically when MZD Connect reads the directory listing, instead of just trusting what it gets from the file system.

So, at least I now have a solution to this constant source of irritation: copy all the files off the USB stick using Windows and copy them all back again. At least Windows can be trusted to write files to a file allocation table that Microsoft (and others) invented. Apple obviously doesn’t give a toss, and unordered filenames are the penalty we must pay for flirting with a foreign disk format, sadly.

One thing is for sure: if/when I buy a new car, I will be making sure the entertainment system knows how to play tracks in the correct order. I’ll create two test folders, one copied on the Mac and the other on Windows. Bonus points for any car that can sort filenames alphabetically/numerically in the dodgy Mac folder. Extra bonus points for any car whose entertainment system doesn’t forget which track number I’m up to when I turn off the car.

Oh, and my next car has to either come with no i-stop feature or have a master switch – one that I only have to press once – to disable it! I’m so sick of my Mazda turning off the engine when I put my foot on the brakes! It might be a handy thing to have on all the time if you lived in a large city with constant traffic jams, but not where I live.

The Web is a great, big, polished turd

Web development isn’t fun anymore. It used to be that you picked a server-side language, typed up some human-readable HTML, sprinkled in a little JavaScript to improve the user experience, and your job was done. These days it has become an all-you-can-eat buffet of overlapping frameworks, many of which don’t play nicely together, and which you are almost blackmailed into using in order to fix problems or cater to a particular web browser; that is, unless you’re lucky enough to use something like Ext JS, which takes care of all these hassles for you.

This past couple of weeks I’ve had to research using web components for a new project. “Keep calm and #UseThePlatform” it says on their web site. So what are they?

Web components are a set of web platform APIs that allow you to create new custom, reusable, encapsulated HTML tags to use in web pages and web apps. Custom components and widgets build on the Web Component standards, will work across modern browsers, and can be used with any JavaScript library or framework that works with HTML.

Sounds promising. Web components are not like your dime-a-dozen widget library and are supposed to be the future of user interfaces on the web. So I read some more and pretty soon it becomes clear that Polymer is probably the right way to go. Somewhere along the way I came across somebody who said that Polymer is to Web Components what jQuery is to JavaScript. It basically makes it easier to write Web Components and cuts down on all of the boilerplate code you have to write; that’s all I need to know.

The use case that I was given involves a simple web service call to search for records and display them in a grid. Easy. I’ve been doing that kind of stuff for almost two decades now, so this shouldn’t be much trouble at all. By a country mile, the only grid worth using if you want to create a custom element (another term for these things which means that I’m creating my own, reusable HTML tag, such as <my-awesome-tag>) is the one from Vaadin. This blog post is already becoming a rambling mess, so I’ll cut to the chase here and say that I couldn’t get version 2 of their grid to work with my Polymer 2 custom element, so I had to install version 3.0.0-beta1 (which works perfectly fine except that it hangs Internet Explorer 11 if you tab into the grid; I can see that beta1 has been pulled and only the 3.0.0-alpha2 version is available again, maybe because of this bug?). No biggie, they’re bound to fix it eventually which shouldn’t stop me from continuing my research.

Within a few days I had created a magical web component which fires a custom event when the user selects a row (so that anybody who uses <my-awesome-tag> can add an event listener to handle it). I also set up data binding between my element and an input box, which automatically sends requests to my web service whenever the query parameter changes, thereby updating the grid with new records seamlessly. Wonderful! Web Components really are the future of the web. There’s only one problem – I use Chrome as my web browser because it’s the fastest (especially when testing apps heavily dependent on JavaScript, such as Ext JS). Does it work in Internet Explorer 11?

No. All this time I had been coding away happily in a version of JavaScript that this nearly four-year-old browser does not understand. Just great! Polymer 2 uses the sixth edition of JavaScript, variously known as ECMAScript 2015 or ES6 [read this if you’re really bored]. If you want to use Polymer 2 components in IE 11 you have to transpile ES6 down to ES5 and serve up the former only to newer browsers (Firefox, Chrome, and Edge 14+) and the latter to all versions of IE and earlier versions of Edge. Yay! I get to experience the delights of browser sniffing (you could just serve the ES5 transpiled bundle to all browsers, including Chrome which supports Web Components natively, but you would need to include the custom-elements-es5-adapter.js shim, which just seems nasty; I’d rather do browser sniffing so that Chrome can fly and only IE suffers).
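The sniffing itself can be as simple as a user-agent check on the server. A hedged sketch (the bundle file names are made up, and the Edge cut-off follows the “Edge 14+” rule mentioned above):

```javascript
// Hedged sketch of the browser sniff: serve the ES5 transpiled bundle to
// IE 11 and pre-14 Edge, and the native ES6 bundle to everything else.
// The bundle file names are illustrative, not from my actual project.
function bundleFor(userAgent) {
  const isIE = /Trident\//.test(userAgent);           // IE 11 reports Trident/7.0
  const isOldEdge = /Edge\/1[0-3]\./.test(userAgent); // EdgeHTML before Edge 14
  return (isIE || isOldEdge) ? 'app.es5.js' : 'app.es6.js';
}

console.log(bundleFor('Mozilla/5.0 (Windows NT 10.0; Trident/7.0; rv:11.0) like Gecko')); // app.es5.js
console.log(bundleFor('Mozilla/5.0 (Windows NT 10.0) Chrome/60.0.3112.90 Safari/537.36')); // app.es6.js
```

This way Chrome (and Firefox, and newer Edge) get native Web Components and full ES6, and only IE pays the transpilation tax.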

After getting that sorted, I turn my attention to getting this custom element to work from within an Angular app. You could do entire apps in Polymer, but that would require a bit more work, and Angular had already been chosen as the other framework for this new project anyway (not by me). So I start learning and find that Angular and Polymer are both projects by developers at Google with a lot in common, but one is playing the short game and the other the long game. Angular does its own thing, whereas Polymer tries to adhere to standards (or specifications which will eventually become standards supported by all browsers one day).

As I type this rant at home on a Friday night, I’m trying to recall the arduous journey I’ve taken this week so that I can fully document the experience for everyone’s amusement. I can’t recall the finer details of the problems I had trying to get my Polymer 2 component to work with Angular 2, but my console was full of indecipherable errors and I soon realised that the two don’t play nicely together; I would need some assistance from @codebakery/origami and would have to upgrade to Angular 4. I suppose I should have started taking notes by this point, because the journey was truly becoming tragic, but I couldn’t get Origami to work either, and my search for other options came up empty-handed. I decided to admit defeat and declare that I could not get Web Components to work with Angular.

If I was asked to evaluate Web Components specifically, instead of just any widget library, then there must have been a good reason, and my research led me to the view that they might just be the holy grail of user interfaces on the web; so maybe Angular was the problem? I soon found this excellent summary of a talk by Rob Eisenberg comparing AngularJS, Angular 2, Aurelia, Ember, Polymer and React. Rob once worked at Google on Angular but eventually left over disagreements about the direction it was taking (and after my brief exposure, I began to see why: it’s over-engineered and locks you into using Angular for everything). The recommendation at the end piqued my interest in Aurelia: it doesn’t get in your way, it supports web components, it offers high performance and clean syntax (Angular 2’s is just ridiculous!), it’s standards compliant, and it’s great if you’re a good developer.

I saw mention of Aurelia in some tech news article once but dismissed it out of hand, thinking to myself “yeah, yeah, another framework… yawn”. The Quick Start guide is probably one of the best such articles I’ve ever seen and the more detailed tutorial did not disappoint. The page describing its Technical Benefits seemed to tick all the boxes, too. I proposed to my colleagues that we should abandon Angular and use Aurelia instead, and after a little hesitation, they went along with it.

Another day, another framework to learn. Never mind; this one finally does seem worth my time, so I got stuck in and had a good time learning it. I then tried to integrate my Polymer component with it and, after a few issues (mainly from me learning a new framework; nothing is ever easy!), it seemed to play nicely enough. I was surprised that it seemed to “just work”. Great. Time to test it in Internet Explorer 11. Computer says no, and I even served up the ES5 version just to keep it happy. I didn’t realise straight away that Aurelia had a seemingly well-documented article on Integrating Polymer components (there was a lot of slightly out-of-date material on the Internet asking about Aurelia/Polymer integration, which suggested it wasn’t officially supported yet, so that was my bad for not thoroughly browsing the docs).

My project directory already had about 120MB in its node_modules and bower_components directories from using npm and bower, but in order to get Polymer to work with Aurelia I had to also start using JSPM. This added another 50MB to my project, thank you very much! I have to call System.import('aurelia-bootstrapper') after the WebComponentsReady event, but my project was created using the Aurelia CLI, meaning that my app is started with a simple script tag pointing to scripts/vendor-bundle.js and wasn’t using the System.import method. My attempts to load the vendor bundle that way didn’t work, and when I tried to create a link element in vanilla JavaScript and append it to the DOM myself after the WebComponentsReady event had fired, I got a blank page in my browser and no errors at all. Well, that’s just great. Aurelia should update that Polymer integration document to explain how to kick-start your app if you’re using a production build and not the System.import quick-start method.

By this stage the day was getting on and it being Friday, I thought, “enough is enough”. I don’t need this shit. Aurelia is nice and all, but I shouldn’t have to subject myself to this much torture just because Internet Explorer 11 has to be the default browser because of some shitty, over-priced, corporate application which only works if IE is set as the default browser, and of course, asking people to go into IE only for that shitty app and to use Chrome or Firefox for everything else just won’t do, now, will it?!

Sadly, this is what web development has become today:

Peter Griffin fighting with the venetian blinds

Being a web developer in 2017 and having to support Internet Explorer 11 is like a chef having to go dumpster diving for food to cook and being forced to serve it in a dog bowl.

I really don’t care any more. Let’s just use Angular for everything and be done with it. We can’t have nice things if people insist on using a web browser that’s almost four years old.

I’ve had the kind of week that only chocolate and ice cream can fix, and maybe binge watching a new TV series, too; I still haven’t watched the latest seasons of Fargo or Better Call Saul. That sounds like a plan to make everything better :-)