13 Dec 2018

PowerShell Core

I recently read a great post by Paul Cunningham on career advice for IT professionals, and this line stuck with me: ‘Change is a constant’. This really resonated, especially as I belatedly caught up with the recent changes in PowerShell.

I started using PowerShell in 2008, after a more experienced colleague advised me to start spending one afternoon a week learning about new technologies and tools as part of my professional development. This is great advice, by the way, and I would recommend that every developer do it. I would also suggest not asking your manager for permission to do this – if you’re a professional software developer, you need to make time as part of your job to learn new technologies. It is both for your benefit (professional development) and your company’s (they get a more efficient developer), even if they may not always realise it.

So I started using PowerShell 10 years ago, and I quickly started using it daily, especially as my career started involving more and more SharePoint development, support and administration. However, sometimes I don’t follow my own advice, and I haven’t been keeping up with the major changes happening with PowerShell. It was only when I realised that development had stopped on the AzureRM PowerShell module, and that Microsoft was instead focusing on a new cross-platform Az module, that I realised that I had missed a number of major announcements about PowerShell and the direction it was heading. With the release of .NET Core in 2016, Microsoft had started implementing a cross-platform version of PowerShell, now known as PowerShell Core 6.0. This was released to general availability back in January 2018. I had missed a major change in an essential developer tool that I use daily. Change really is constant, and I hadn’t been prepared for this particular one!

The major differences between PowerShell Core and the legacy Windows PowerShell are:

  1. PowerShell Core is cross platform and can be run on Windows, Linux and Mac systems.
  2. PowerShell Core is open source.
  3. Currently, the major breaking changes in PowerShell Core are:
    1. PowerShell Workflows are not available.
    2. The Out-GridView cmdlet is not available in PowerShell Core.
    3. PowerShell Core does not support the WMI v1 cmdlets and a large number of other Windows OS specific cmdlets.
    4. While Windows PowerShell ships with the ISE editor, PowerShell Core encourages the use of Visual Studio Code.
    5. PowerShell Core must now run on Unix operating systems, where file and path names are case-sensitive (the language itself remains case-insensitive).

As PowerShell Core uses the less feature-rich .NET Core and .NET Standard, it currently only offers a subset of the functionality offered by Windows PowerShell. This will change over time as the PowerShell Core framework matures, and as more functionality is developed for it. While Windows PowerShell will continue to be maintained (with bug fixes and security updates), there will be no new functionality added to it.

It is pretty easy to get started using PowerShell Core. It can be installed and run alongside your existing Windows PowerShell. I use Chocolatey to install (almost) everything on my Windows PCs, as it allows me to configure automatic updates. To install PowerShell Core using Chocolatey, run:

choco install powershell-core
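Once installed, PowerShell Core runs as pwsh.exe, alongside the existing powershell.exe. You can confirm which edition a given shell is running from within it:

```powershell
# Check which PowerShell you are running
$PSVersionTable.PSVersion   # 6.x for PowerShell Core
$PSVersionTable.PSEdition   # 'Core' for PowerShell Core, 'Desktop' for Windows PowerShell
```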

Once installed, it is worth configuring the following:

  1. Modify your profile. The profile path in PowerShell Core is different to that of Windows PowerShell, and is at C:\Users\{username}\Documents\PowerShell\Microsoft.PowerShell_profile.ps1
  2. Modify VS Code to use the PowerShell Core Integrated Terminal.
  3. Install the Windows Compatibility module to allow PowerShell Core to invoke commands that are currently only available in Windows PowerShell. This will allow you to run existing PowerShell scripts in PowerShell Core without any changes.
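As a quick sketch of the first and third steps (the module is published as WindowsCompatibility on the PowerShell Gallery; the module loaded at the end is just an example – substitute one of your own):

```powershell
# The $PROFILE variable holds the profile path for the current user and host
$PROFILE

# Install the compatibility module, then load a Windows PowerShell-only module into Core
Install-Module -Name WindowsCompatibility -Scope CurrentUser
Import-WinModule -Name PnpDevice
```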

As I’m spending a lot of time in Azure, the first new PowerShell Core module I installed was the new Az module, which replaces the older AzureRM modules. Once the module is installed, to ensure compatibility with existing AzureRM scripts you should run the Enable-AzureRmAlias cmdlet, which enables the AzureRM aliases for the current session only.
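A minimal sketch of those steps:

```powershell
# Install the cross-platform Az module and load it
Install-Module -Name Az -Scope CurrentUser
Import-Module Az

# Enable AzureRM aliases; without a -Scope argument this applies to the current session only
Enable-AzureRmAlias
```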

8 Nov 2018

PowerShell - Writing to the same line with Write-Host

This is a PowerShell tip that I’m posting for future use. If you want to write information to the same line using the Write-Host cmdlet, you use the -NoNewline switch:

Write-Host "Countdown - $index seconds remaining" -NoNewline

If you want to overwrite the previous content written, you can use the carriage return character (courtesy of this StackOverflow answer by Dullson):

Write-Host "`rCountdown - $index seconds remaining" -NoNewline

Note, you may need to use spaces to blank out previous content if the previous string is longer than the new content string:

Write-Host "`rCompleted $seconds second countdown.     "

Note, the line above doesn’t use the -NoNewline switch, so subsequent output will start on a fresh line in the console. I’ve used this PowerShell snippet for a simple sleep timer with a visual countdown, when I didn’t want to use the default progress indicator (the Write-Progress cmdlet).
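Putting the pieces above together, a minimal version of that countdown timer might look like this:

```powershell
# Simple visual countdown that overwrites the same console line each second
$seconds = 10
for ($index = $seconds; $index -gt 0; $index--) {
    Write-Host "`rCountdown - $index seconds remaining" -NoNewline
    Start-Sleep -Seconds 1
}
# Trailing spaces blank out leftover characters from the longer countdown string
Write-Host "`rCompleted $seconds second countdown.     "
```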

30 Oct 2018

Goodbye to Ubuntu, Hello Windows 10

For the past 18 months, I've been using Ubuntu for my main laptop at home. This is my third attempt to use a Linux distribution as my primary Operating System. However, in the past week, I have gone back to using Windows on my laptop. This blog post explains why.

I have been using Ubuntu 16.04 on my laptop, and I have to admit to being impressed initially. Everything. Just. Worked. This was in contrast to previous attempts to use Linux, which had failed due to repeated hardware (driver) issues. This time, my monitors and printers just worked in the same way you would expect on a Windows machine, with no messing around in dot files, or desperate searching of online forums for the correct configuration.

After a while, I started to notice a few pain points. These were mainly around programs that were unavailable for Linux, such as WeChat. I could normally work around these programs, but the program I really missed was iTunes. My wife and I both use iPhones and iPads, and as I'm the family IT guy, I'm responsible for backing them up and upgrading them. I created a Windows VM to install iTunes on and used this to back up the iOS devices. However, the process for connecting the device to the VM, and forcing the host to release its connection to the device, was convoluted. What should be a 10-15 minute process to back up your iPhone would require an hour or two of configuring and restarting VMs and reconnecting devices.

As I only backed up the devices once a quarter, this was a pain but bearable. But the final nail in the coffin for my current Linux experiment was the news that Dropbox would no longer support (the default) Ubuntu encrypted hard disks from November. I've been a paying customer of Dropbox for seven years, and my workflow is optimised around using it. While I'm deeply unimpressed by their decision to yank support from their Linux customers, I'm not currently prepared to stop using it.

To make sure I would be able to continue to use Dropbox, I've rebuilt my laptop with Windows 10. It is great to be using the same OS both at home and work, and I've built both machines using the same configuration scripts. I had thought about running a dual boot setup at home, but as I already use Windows Subsystem for Linux (WSL), I didn't see it being very useful. If I need a full Linux setup, I can simply spin up a virtual machine.

While I miss the full control I had over my Ubuntu OS (with all the configuration you need to do to try and keep Windows 10 secure and private, you realise that you're very definitely not in full control of it), I do enjoy the ease of using the Windows OS. It is just easier to get stuff done. When I'm at home, I have limited time to work on personal projects, and I don’t want to be messing around trying to fight the OS. While I enjoyed playing with Linux, I'm not sure I'll be rushing to make it my primary OS again. While Linux has dramatically improved (particularly around hardware support), the lack of support for iOS devices, and the lack of certain proprietary programs mean that Linux still isn't ready for widespread use.

29 Aug 2018

No Tea in China...

My family and I were lucky enough to spend June in China visiting my wife's family. We spent a week in Hong Kong, and then crossed the border to spend the next 3 weeks in and around Shenzhen. I thought I’d post a few random thoughts and observations from my trip...

Hong Kong

I found Hong Kong to be a very friendly city, and I enjoyed the opportunity to use my very rudimentary Cantonese. Despite being an incredibly busy city of some 7 million people, it is a great place to visit as a family with a young child (though we passed on visiting the Peak – the tram didn't look pram friendly). Unfortunately, we spent most of our time organising visas for China, so we didn't see as much of Hong Kong as I hoped. My main impressions were the simply fantastic food (we had our best meal in a little local canteen that most tourists wouldn't look twice at), the excellent metro service (the Octopus card is used widely in Hong Kong instead of cash), and a futuristic and incredibly efficient airport.

Mei and Matthew at a public park in Hong Kong


Shenzhen

We spent most of our time in Shenzhen, a city in Guangdong Province, just to the north of Hong Kong. Most of my wife’s family live here, and we spent a lot of our time visiting her relatives and introducing them to our son Matthew for the first time.

Shenzhen is a massive city of over 16 million inhabitants that has sprung up in just 30 years from what was originally a small fishing town. The city is full of skyscrapers (residents are very proud of having the second tallest building in China) and appears to be continually rebuilding itself every few years. People revisiting the city after a few years often get lost, as the cityscape is unrecognisable from the last time they visited.

Shenzhen skyline at Night

Some random thoughts:

  • Rental bikes are absolutely everywhere, littering the streets, but relatively few people seem to be using them. A lot more people appear to be using battery powered bikes.
  • Unsurprisingly in this new city, there is still a significant wealth disparity, with large numbers of migrant workers from rural China coming to the city for work.
  • Related to the above, you’ll more commonly speak Mandarin than Cantonese – the people you interact with day to day in Shenzhen (in taxis, in the supermarket) will typically be migrant workers and will not speak Cantonese.
  • Like Hong Kong, Shenzhen has a very efficient public transport system (both the local bus network and the growing metro system). The public transport makes up for the traffic in Shenzhen, which is insane.
  • Living costs in Shenzhen seem to be relatively low (for a Westerner), but high relative to the rest of China.

Whilst in Shenzhen, we paid a visit to Huaqiangbei, the world-famous electronics market. It was an interesting place to visit, but speaking to some of Mei’s family, the market seems to be in decline and employs fewer people than in the past.

As I’m currently working in a UK university, we visited a local university campus in Shenzhen. There are a number of different universities in Shenzhen, notably the Southern University of Science and Technology (see also this article). We visited the campus of the Peking University HSBC Business School, which is an international graduate business school, but we also passed by some of the other universities during our travels. It is striking how large the university campuses are, and how much building is going on at them. Higher education is a major focus of the Chinese government, which is spending a lot of money to turn its universities into world class institutions. UK universities will not be able to compete on shiny new buildings alone, something which the UK VCs don’t seem to understand. The only way to compete with Chinese universities will be by delivering world class teaching and research – something that UK VCs are actively harming by treating staff as a hindrance rather than an asset. I expect a lot more Chinese students to opt to study in China in the future as the world standing of their universities continues to increase, along with increasing numbers of foreign students.

Entrance to a Hakka Village near Shenzhen

We also got a chance to visit the nearby smaller cities of Pingshan and Dayawan, as well as spending a couple of nights at a local holiday resort.

Other random thoughts:

  • I noticed a lot more use of people and manual labour (in construction and retail) than would be the case in Europe.
  • There is still significant inequality between the city and rural areas. This is unsurprising for a country that has changed, and continues to change, so quickly in recent years.
  • There is a very real exercise and fitness culture in China. There were daily dance and Tai Chi sessions in all the local parks that we visited. The parks were filled in the evenings with people exercising (walking and jogging, playing badminton, attending dance classes) after work.

Exercising in an outdoor gym

  • Driving in China (and particularly Shenzhen) is insane. Defensive driving is essential, as drivers will drive very closely in traffic and will change lanes rapidly without indicating. Whilst the Chinese government is promoting self-driving cars, they are a long way from appearing on China’s roads. But given that road accidents kill 700 people a day in China, there may well be a greater acceptance of self-driving cars in China than in Europe or the US.
  • China is a tough place to visit as a foreigner. If you don’t speak or read the language, there is very little allowance given for you, unlike other countries in Asia. Shenzhen was significantly easier for a foreigner than the smaller cities, but still tough. The major street signs in Shenzhen were in both Chinese and English, which was a life saver. I knew a little Cantonese and very little Mandarin, and couldn’t read any Chinese characters. I’m already looking at learning more Mandarin and learning to read Chinese characters in preparation for my next visit.
  • WeChat is an essential app if you’re visiting China, as well as using a VPN to access websites outside of the Great Firewall of China.
  • Despite trying every coffee and tea shop I came across, I was unable to find a decent cup of black tea with milk. If I hadn’t packed my own teabags, I doubt if I would have survived the whole trip.

China is huge country, and I only spent a brief time there. But I really enjoyed our stay, and I came away impressed with the scale of the country’s ambition and the people I met there. I’m looking forward to our next visit to China already.

28 Aug 2018

Reducing Azure Costs

As part of my study for the Microsoft 70-533 exam, I’ve been interested in finding out how to minimize my costs in Azure. This is particularly important as I’m spinning up a large number of virtual machines and web applications as part of my study for the exam. I’ve gathered the following list of tips on how to reduce your Azure costs.


  • Each Azure service is priced differently. Use the Azure Pricing Calculator to estimate the cost for a specific resource and to identify all the resources/services used to host an application.
  • While it is obvious that different services will have different costs, not everyone is aware that the same service will vary in cost between the different Azure locations. The https://azureprice.net/ site lists the regional pricing for Azure VMs.
  • You’re charged for the resources services you use, so shut down non-production resources when they're not required (like outside working hours). This can be done using either a scheduled script or by using one of the various cloud management applications.
  • Use pre-paid subscriptions to get a discount (even on pay as you go subscriptions) - Microsoft offers discounts of 2.5-5% based on 6 or 12 months pre-payment.
  • If available on your subscription, make use of a spending limit on your Azure account/subscription.
  • Also set up automated email alerts to email you once your account spends more than a set amount.
  • Audit your Azure usage so you know what you are paying for but not using in Azure:
    • View your Azure Subscription in the portal
    • Use the free service https://azure-costs.com/ to identify your costs
    • Consider using the Azure Cost Management service offered by Cloudyn, a Microsoft subsidiary.
    • Delete unused resources
  • Use auto-scaling to reduce costs during off hours. These deployment types all support auto-scaling:
    • Cloud Service
    • App Services
    • VM Scale Sets (Including Batch, Service Fabric, Container Service)
  • Scaling could also mean shutting your app down completely. App Services have a setting called Always On that controls whether the app is unloaded after a period of inactivity. You could also schedule shutting down your dev/QA servers with something like DevTest Labs. There are also third-party services like Park My Cloud.
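As a sketch of the scheduled-shutdown approach above, using the newer Az module (the resource group name is hypothetical, and an authenticated session via Connect-AzAccount is assumed):

```powershell
# Deallocate every VM in a resource group so compute billing stops.
# Stop-AzVM deallocates by default; -StayProvisioned would keep the allocation (and the charges).
$resourceGroup = 'my-dev-rg'
Get-AzVM -ResourceGroupName $resourceGroup | ForEach-Object {
    Stop-AzVM -ResourceGroupName $resourceGroup -Name $_.Name -Force
}
```

Run on a schedule (for example from an Azure Automation runbook), this covers the "shut down non-production resources outside working hours" tip.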

Virtual Machines

  • Shut down and deallocate VMs to stop Azure from continuing to reserve the VM's compute resources
    • Use Azure Portal or PowerShell to do this
    • Can also configure the auto shutdown option on the VM
  • Resize over-provisioned VMs
    • Note, the VM will have to be rebooted, so need to avoid when the VM is under peak load
  • Use Azure PaaS features; don’t roll your own on VMs unless you have to.
  • Make use of Dynamic Scaling with VM Scale Sets
  • Use Azure Marketplace VM Images to prevent additional licensing costs (such as for the Windows OS)
  • Use Azure DevTest Labs to automate the start-up and shutdown of VMs, and also implement quotas and policies for VM management
  • Use Reserved Instances to get discounts on the price of VMs - need to reserve for 1 or 3 years
  • Make use of existing Microsoft licences in Azure using the Azure Hybrid Benefit
  • Make use of Azure Site Recovery to back up your VMs
    • Only pay for replication software and the cost of storage
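Resizing an over-provisioned VM can be done in a few lines with the Az module (names and sizes here are hypothetical; note that the VM restarts as part of the resize):

```powershell
# Resize a VM to a smaller (cheaper) size
$vm = Get-AzVM -ResourceGroupName 'my-rg' -Name 'my-vm'
$vm.HardwareProfile.VmSize = 'Standard_B2s'   # list valid sizes with Get-AzVMSize -Location 'uksouth'
Update-AzVM -ResourceGroupName 'my-rg' -VM $vm
```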

Azure SQL

  • Blob storage offers a cost-effective way to store binary data. Blob storage of type Table and Queue of 2 GB costs $0.14/month, and type block blob costs just $0.05/month. A SQL Database of similar capacity will cost $4.98/month. Hence, use blob storage to store images, videos and text files instead of storing them in SQL Database. To reduce the cost and increase performance, put the large items in blob storage and store the blob record key in the SQL database.
  • If you want to develop, test, or build a proof of concept, then use the freely licensed SQL Server Developer edition. A SQL Server Developer edition VM only incurs charges for the cost of the VM, because there are no associated SQL Server licensing costs.
  • If you want to run a lightweight workload in production (<4 cores, <1 GB memory, <10 GB/database), use the freely licensed SQL Server Express edition. A SQL Server Express edition VM also only incurs charges for the cost of the VM.
  • Use the appropriate edition of SQL Server for other tasks.
  • Use the Pay Per Usage (per second cost of the VM including the SQL Server licence) or the Bring Your Own Licence (BYOL) option as appropriate.
  • Paying the SQL Server licensing per usage is recommended for:
    • Temporary or periodic workloads. For example, an app that needs to support an event for a couple of months every year, or business analysis on Mondays.
    • Workloads with unknown lifetime or scale. For example, an app that may not be required in a few months, or which may require more, or less compute power, depending on demand.
  • Bringing your own SQL licensing through License Mobility is recommended for:
    • Continuous workloads. For example, an app that needs to support business operations 24x7.
    • Workloads with known lifetime and scale. For example, an app that is required for the whole year and which demand has been forecasted.
  • If you have a lot of databases, consider using SQL Elastic pools.
  • Optimize your SQL Database performance
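A sketch of the blob-plus-key pattern described above (the storage account, key variable and container names are all hypothetical; assumes the Az.Storage module):

```powershell
# Upload a large file to blob storage and keep only its URI for the SQL row
$ctx = New-AzStorageContext -StorageAccountName 'mystorageacct' -StorageAccountKey $storageKey
$blob = Set-AzStorageBlobContent -File '.\photo.jpg' -Container 'images' -Context $ctx
$blobUri = $blob.ICloudBlob.Uri.AbsoluteUri   # store this string in the database instead of the image
```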

App Services

  • Migrate Apps to Azure App Service, instead of 'Lift and Shift' to a dedicated VM. This is a cheaper option and avoids having to manage a VM.
  • Avoid paying for staging slots on Cloud Services:
    • Delete staging slots as quickly as possible, as these are charged at the same rate as a Production slot
    • Cloud Services cost you money even if the servers are stopped. You have to delete them!
  • Combine web apps with Azure App Service Plans to reduce server count.
  • Stop using Cloud Service Web Roles unless you have to. Use App Services instead - these are also faster to deploy to.
  • Move Worker Roles to Azure Service Fabric or Container Service. This allows you to combine apps and reduce server count and cost.


  • Delete unused VPN gateways and application gateways, as they are charged whether they run inside a virtual network or connect to other virtual networks in Azure. Your account is charged based on the time the gateway is provisioned and available.
  • Use Azure Storage for SMB File Shares, instead of a dedicated VM. Again, this is a cheaper option and avoids having to manage a VM.
  • Improve the performance of your application to reduce the amount of cloud resources it uses. Make use of logging and performance monitoring tools/profilers to identify ways of improving your app.
  • Consider using a CDN like CloudFlare for additional caching.
  • Consider the Azure cold storage option for reduced costs
  • Consider using Azure Automation to automatically build and tear down services/resources when required.


21 Aug 2018

Microsoft Exam 70-533 - Implementing Microsoft Azure Infrastructure Solutions

As I develop solutions using Azure, and because I want to learn more about the platform, I’ve decided to study for the 70-533 Implementing Azure Infrastructure Solutions exam. I have mixed thoughts about certifications, but this one will help in my day-to-day work and seems to be in demand with employers.

I have chosen to do the IT exam (70-533) first, rather than the Developer exam (70-532), as there is more of a focus on PowerShell in 70-533, and I love working with PowerShell. There is also significant overlap between the 2 exams, and I do plan on completing 70-532 at a later date.

As I study, I’ll be posting some blog posts on various Azure related topics. I’ll use this post to list all Azure related posts:

  1. Reducing Azure Costs

10 Aug 2018

Gmail Search and Filters

A quick post about the idiosyncrasies of searching and filtering in Gmail. As a long time Gmail user (since 2008), I’ve made significant use of filters and labels to organise my emails. I also make use of several different domains and email accounts that I manage using Gmail as aliases. However, as I look to move off Gmail and away from Google services, I want to identify exactly who is emailing me with what email address. Specifically, what emails are sent to an alias using the plus operator (andy+newsletter@gmail.com instead of andy@gmail.com)?

Note, andy@gmail.com is NOT my email address, it is simply being used in this post as an example.

To help with this, I have deleted all my existing labels and filters in Gmail, and created new filters that label my mail according to the email address it was sent to, using search terms like to:(*@gmail.com), to:(*@googlemail.com) and to:(*@andyparkhill.co.uk). Note, I have since read, and confirmed myself, that the wildcard operator “*” doesn’t really work in Gmail search.

By default, the to:(*@gmail.com) filter will capture any email sent to aliases using the plus operator (like andy+newsletter@gmail.com), but it will not capture email sent to aliases that make use of periods (like an.dy@gmail.com).

I had hoped to be able to create a separate label for email sent to the email aliases, but after spending significant time reading up on Gmail search operators and experimenting, I now realise that this is simply not possible. Instead, I can use the existing Gmail label to help search for any emails sent to my Gmail aliases (using the plus operator) using the following search term:

“label:gmail to:-andy@gmail.com” (without quotation marks)

This search will return emails sent to andy+newsletter@gmail.com and andy+hackernews@gmail.com, but not those sent to andy@gmail.com.

Another useful search term I’ve started using is:

“-has:userlabels -in:sent -in:chat -in:draft -in:inbox -from:me” (without quotation marks)

This search will return all emails that are not labelled and that have not been sent by me.

8 Jun 2018

The Future of SharePoint Development

This post is the sixth and last in a series looking at how SharePoint development has changed over the years, and how it is likely to change in the future. In this post, we will look ahead to the future of SharePoint development.

More posts in the series:

SharePoint Online and Office 365

Following on from the release of SharePoint 2016, SharePoint Online is the modern SharePoint platform. New features come first to the cloud and, when appropriate, make their way to the on-premises version. As such, it now makes sense to discuss the roadmap for SharePoint Online and Office 365 before considering the on-premises version.

At the SharePoint Virtual Summit in May 2017, Jeff Teper (Corporate VP of Office, SharePoint and OneDrive at Microsoft) reinforced SharePoint’s central position in the Office 365 business. This was seen as an attempt to settle the uncertainty about the platform’s future prior to the release of SharePoint 2016. Following this, at the 2017 Ignite conference in September, Microsoft announced the following features for SharePoint Online:

  • A number of new web parts would become available (such as a Microsoft Forms web part, a planner web part, the file preview web part, the activity web part to see what others in your organisation are working on, and a 3D file viewer web part). These were released in January 2018.
  • Better Teams and Groups integration, including being able to launch Teams directly from SharePoint, and being able to embed individual SharePoint pages within a Teams tab.
  • A number of Mobile app improvements, such as improved native mobile sites, an interactive news feed and bookmarking functionality.
  • Improved user experience on intranet sites:
    • Adding more web parts and layouts for communication sites
    • Adding News pages to both Team and Communication sites. These will be similar to wiki pages, but will allow users to easily add content that looks good.
    • Improved people cards making use of a LinkedIn integration to retrieve data for people in your organisation.
    • Streamlined custom site provisioning
    • Improved tool box for web parts to easily add web parts to any page.
    • Hub sites that provide a summary of content from other designated sites. These were released to SharePoint Online in March 2018, but they will not be supported in the initial release of SharePoint 2019.
      • Additionally, when a site is joined to a Hub site, the site inherits the Hub’s theme by default and offers the ability to run Site Scripting methods
  • List improvements, including the ability to add PowerBI items and Bing maps into list view forms, and no more paging for large lists.
  • Improved search functionality, such as personalized search results, an updated results page, live preview of files and the inclusion of folders in the search results (released December 2017)
  • Security & Governance
    • A new SharePoint Admin Center, based on the Office 365 Admin Center. This will allow administrators to analyse file-based and site-based activities and manage all sites from a single page. Released in January 2018.
    • Improved functionality to manage your SharePoint sites, including previously hidden sites created by Groups/Teams.
    • Site-level conditional access, based on the device the user is currently using.
    • Set compliance based on a user’s location, and the applicable regulations, a feature known as ‘Multi-Geo Capabilities in Office 365’.
    • Service-level encryption
    • Self-service OneDrive restore (released in January 2018)

Additionally, at the SharePoint Conference North America two weeks ago, Microsoft announced that the following features would be coming to Office 365:

  • SharePoint Spaces will allow users to make use of AI and VR/AR content for immersive experiences
  • Other Office 365 applications will start making use of AI capabilities for content collaboration:
    • Personalized intelligent search in the SharePoint mobile app
    • The updated Office.com site that uses AI in the Recommended and Discover sections to personalize the recommendations to each person
    • Enhanced image searches
  • Organisational news and page management for intranet communications
  • GDPR and multi-geo data residency for SharePoint
  • Improved integration between SharePoint Online and Microsoft Teams

As you can see from above, and from the Office 365 roadmap, there are a number of new features being actively developed for SharePoint Online. The majority of changes to SharePoint Online are user interface improvements, new features to allow organisations to manage their data in the cloud, and greater integration with other Office 365 services.

From the 2017 Ignite conference, it was made clear that OneDrive would play a more substantial role in Office 365 and that it would become the “universal way to access all your files”. New features include:

  • Files On-Demand – see and sync all your files (coming to Windows 10 this summer); open, access and share files right from desktop
  • Share seamlessly from File Explorer and Mac Finder
  • Expanded administration controls for sharing
  • Able to share data securely without those accessing requiring a Microsoft account
  • File previews (using the same file preview web part used in SharePoint)
  • A scan button in the OneDrive mobile app to allow users to quickly capture photos, receipts, and documents. They will also be able to have OneDrive automatically upload photos to Office 365 for further processing (such as text extraction)

There has never been a clear distinction between SharePoint Online and OneDrive – they are both online document services used for storage and file sharing. Typically, OneDrive is used for personal storage while SharePoint is used to store and collaborate on documents for an organisation/department/team. The above changes to OneDrive seem to blur the distinction between the two products further.

Microsoft is now pushing the cloud model as the main way of using Office applications, with your content and your applications being available wherever you are and on whatever device you are using. As part of this push, Office 2019 will likely be the last version to be sold with a perpetual licence (bought outright, as opposed to buying a subscription to the software). Future versions will be tied to an Office 365 subscription.

SharePoint 2019

Microsoft confirmed at the MS Ignite conference in September 2017 that it would release SharePoint 2019 towards the end of 2018, with the first previews due soon. The following features are likely to be included in SharePoint 2019:

For Users:

  • More modern page layouts and site template (such as the Team News and Communication sites). However, the new Hub Sites that have been recently released in SharePoint Online, will probably miss the initial release of SharePoint 2019 (but will probably be available via a Feature Pack at a later date).

  • The modern look and feel of SharePoint Online will become the default look in the on-premise version, leading to an updated modern looking UI. The classic pages will remain, for this release at least, as a fall back. This includes:

    • The improved SharePoint Admin Centre

    • Suite Navigation

    • App Launcher

    • Modern sharing experience

  • Improved file activity, usage tracking and stats for files

  • An updated NextGen OneDrive Sync Client, with no support for SharePoint 2016. This will likely include the Files On-Demand feature to provide the selective synchronization of SharePoint-based documents.

  • Significant improvements to lists including:

    • Improved list creation user experience

    • The ability to copy and paste from Excel into a SharePoint List (eliminating the need for datasheet views)

    • Row formatting

  • An updated UI for the SharePoint mobile app

  • Improved support for Hybrid architectures. This will include greater integration with Flow and PowerApps, the successors to InfoPath, allowing for data to move more easily between on-premise and cloud based SharePoint instances. These are seen as ‘no code’ business process tools for power users.

    • Forms (used in list and libraries) can be edited with PowerApps to create custom forms (the old use case for InfoPath)

    • Microsoft Flow will become the default workflow engine, although old workflows will remain usable
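
Row formatting in the modern UI is configured with a JSON description applied to the list view. A minimal sketch, based on the view-formatting JSON schema used in SharePoint Online (and assuming a hypothetical date column named DueDate), might highlight overdue rows like this:

```json
{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/view-formatting.schema.json",
  "additionalRowClass": "=if([$DueDate] <= @now, 'sp-field-severity--severeWarning', '')"
}
```

The expression adds a warning CSS class to any row whose due date has passed; the column name and styling class here are illustrative only.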

For Administrators:

  • Direct links in Central Administration to SharePoint documentation
  • SMTP authentication when sending emails (including Office 365)
  • Microsoft will be releasing Workflow Manager 2019 to replace Workflow Manager 1.0
  • InfoPath, despite being deprecated, will continue to be supported in SharePoint 2019. InfoPath has an extended support date of July 2026.

    • Similarly, SharePoint Designer is likely to be still supported in SharePoint 2019, which just goes to prove the saying “the ones you hate never, ever leave”. However, Jeff Teper suggests that SharePoint Designer is unlikely to be needed, as all customization should be possible using the new web parts.

    • Improved Hybrid administration options:
      • A new SharePoint Hybrid status bar to allow you to monitor the health of your hybrid configuration
      • Direct links in Central Administration to the Hybrid Configuration Wizard
      • OneDrive in Office 365 will be used by default in Hybrid scenarios, rather than on-premise.
      • The Modern Office 365 Search will be used to give a consistent experience in a hybrid environment
      • The ability to configure hybrid scenarios whilst installing SharePoint 2019
    • Use of # and % characters will be allowed in file and folder names
    • Increase in the URL path limit to 400 characters
    • The SharePoint Migration Tool, which will include PowerShell support, will be included to move content from SharePoint on-premise databases to either SharePoint Online or the OneDrive service.

    For Developers:

    • Greater support for the SPFx development model, with a continued push to make this the default development model.

    The major new features announced so far for SharePoint 2019 are the Modern user interface and the improved integration with Office 365 for hybrid architectures.

    Developing for SharePoint 2019

    With the release of SharePoint 2019, the SharePoint Framework (SPFx) will become the only framework to survive two releases as the preferred development model. This will hopefully encourage the developer community to start using it more. SPFx became available to use with SharePoint 2016 on-premise following the release of Feature Pack 2 in September 2017.

    Microsoft is continuing to reboot the UI with modern pages becoming the default page type, and with SharePoint 2019, developers will be able to use the SPFx to develop web parts for the Page model. There are unfortunately no plans to bring the modern UI to SharePoint 2016 on-premise.

    I have mixed feelings about the SharePoint Framework. I feel that (at last) Microsoft has given developers a development model worth using, and that a break from the legacy SharePoint server-side development practices was overdue. The SharePoint Framework embraces web development best practices and tooling, and opens up SharePoint development to a wider audience of developers. However, as the majority of organisations are still on SharePoint 2013 or earlier (see below), we won’t see a noticeable adoption of SPFx for a number of years.

    It will be interesting to see if Microsoft finally moves to kill off some of the legacy development models. While it is clear that full trust server side solutions will remain (as they are required to make up for the limitations of the SPFx model), will SharePoint Add-Ins and JSLink still be supported in SharePoint 2019? Add-in web parts can be added to modern pages (in SharePoint Online), but I'm not aware of many people using them.

    As noted above, Microsoft has again focused on improving support for hybrid architectures. In particular, the integration of PowerApps and Flow is only for hybrid scenarios, not for on-premise only.


    I started writing the initial version of this post several years ago as a rant against the App Model and SharePoint Add-Ins in SharePoint 2013. I could clearly see that, despite the protestations of Microsoft evangelists, the App Model was increasing the complexity of SharePoint solutions, while at the same time offering reduced capability compared to full trust solutions. Other developers clearly felt the same, as very few used SharePoint Add-Ins. This lack of adoption led Microsoft to launch their latest development model, the SharePoint Framework, effectively killing the App Model.

    As I wrote and rewrote the rant, it morphed into the current series of posts that look at how SharePoint development has changed over the years. Clearly, in a product that is approaching 20 years old, the recommended best practices are going to change over time as new features are added and development methodologies and tools are adopted, modified and dropped. However, SharePoint developers have seen Microsoft release 4 major development models in the past 10 years. This profusion of development models has been driven by Microsoft’s increasing focus on SharePoint Online and Office 365. This caused the identity crisis I described in my post on SharePoint 2010 – was SharePoint an online service or on-premise software?

    This question was definitively answered with the release of SharePoint 2013 – SharePoint was an online service. This led to doubts if another on-premise version of SharePoint would be released. In the end, Microsoft’s large enterprise customers effectively forced the release of SharePoint 2016, as they were (and largely still are) unwilling to trust their data to Microsoft’s cloud.

    Cloud Adoption

    Microsoft’s focus is clearly on cloud based subscription services, and it has been recommending that organisations move to SharePoint Online since 2014. At the 2016 Future of SharePoint event, a Microsoft speaker revealed that 90% of its 70,000 internal SharePoint sites were hosted on Office 365. At the recent SharePoint Conference North America, it was revealed that:

    • 400,000 organizations are on SharePoint
    • 70% of all seats are in the cloud
    • 135 million active users in Microsoft 365

    It is worth digging into these figures. While 70% of all seats may be in the cloud, these seats will be purchased as part of an Office 365 subscription. These may not be used – based on this survey, around 70% of organisations with Office 365 subscriptions will actually make use of SharePoint Online. (In the same survey, very few organisations valued the SharePoint Online service, unlike email and Office apps – it seems Microsoft has some way to go to deliver business value to customers using SharePoint Online).

    Microsoft 365 is a package of Windows 10, Office 365 and Enterprise Mobility & Security. I’ve been unable to find a breakdown of the Microsoft 365 figures – does the quoted active users figure include people who have purchased Windows 10 (direct from Microsoft or via an OEM)? How many of those users are in subscriptions that have been paid for (as opposed to free student subscriptions)? Until I see such a breakdown of paid versus free subscriptions, and details of the usage of SharePoint Online and other applications by subscribers, I’m dubious as to how useful or relevant these figures are.

    More meaningful figures can be found in Concept Searching’s 2017 SharePoint and Office 365 State of the Market Survey White Paper. In this survey, most organisations that responded indicated that they were using SharePoint 2013 or earlier (around 50% were on SP2013, and a massive 40% were still on SP2010). These figures are backed up by an earlier report from Rencore (The State of SharePoint and Office 365 Development 2016). Only around 20% had migrated to SharePoint 2016, due to the release’s lack of compelling business features. Around 50% of the organisations used SharePoint Online/Office 365 in some form. Just over 20% planned to migrate to SharePoint Online/Office 365, and around the same figure were considering a hybrid architecture.

    The white paper makes clear that as organisations grow more aware of the benefits and disadvantages of cloud based services like Office 365, they are less keen to replace on-premise applications wholesale:

    ‘The move to the cloud is happening, but not in the droves Microsoft anticipated. Many organizations are now spending the time to develop cloud and cybersecurity plans, and are much more knowledgeable about the pros and cons of “life in the cloud”. As such, they are in no rush to move to the cloud, and, in fact, plan on keeping key applications on-premises.’

    The key reasons for organisations not wanting to move SharePoint on-premise to the cloud are:

    1. Large organisations have invested in and customized their existing (generally complex) SharePoint on-premise deployments. Moving to the cloud would mean losing this investment, and may require significant work to replicate these customizations in Office 365.
    2. The long term price for an organisation’s Office 365 subscriptions could greatly exceed the price of their existing on-premise deployment.
    3. Avoiding a costly migration to Office 365 that would involve reviewing content, updating IT infrastructure and considering the governance and security implications. (It is important to note that organisations using SharePoint have typically been slow to migrate to newer versions, as the migration is a complex and time consuming process. The constant churn of development models has also helped contribute to a reluctance to upgrade to the latest version – no organisation wants to invest time and effort in a development framework only for Microsoft to drop it a year or two later.)
    4. Security and compliance – organisations are afraid of entrusting their data to Microsoft. They are also aware of how difficult it would be to export their data once it was stored in Microsoft’s data centres, possibly in a different country. This is particularly important now that the GDPR legislation has come into force.
    5. Restricted customization options available in Office 365 – Microsoft limits significantly how SharePoint Online can be customized. Organisations using SharePoint on-premise have full control over how they use and customize their solutions.

    The main takeaway from the white paper for me was that the majority of organisations are still on SharePoint 2013 or earlier. While most organisations are also making use of Office 365, I suspect this is mainly for email, OneDrive and MySites. Migrating existing sites from on-premise to Office 365 with their customizations seems to be quite rare. It also means that adoption of SPFx is likely to be slow, as organisations using SharePoint 2013 or lower are unable to use it.

    One key reason for organisations not upgrading to more recent versions of SharePoint is the lack of compelling features in the latest releases. Both SharePoint 2016 and 2019 are incremental releases that offer relatively little to organisations already using SharePoint 2010 or 2013. Customers still on SharePoint 2013 may be tempted to upgrade to SharePoint 2016/19 as mainstream support has now ended, but new customers will be more likely to use SharePoint Online and Office 365, something that the Concept Searching white paper also highlights.


    To encourage the legacy on-premise customers (generally large organisations) to move online, Microsoft has been pushing hybrid architectures, and there is increasing support for hybrid scenarios in SharePoint 2019. In particular, Microsoft is keen to offer online services that can replace customers’ customized on-premise solutions, to allow organisations to avoid costly rewrites of existing custom code. This will also allow Microsoft to hook into an organisation’s data and ease any future migration to the cloud.

    However, organisations who use hybrid architectures have learnt that doing so greatly increases the complexity of, and the time to implement and deploy, SharePoint solutions. Organisations running a hybrid environment will face a number of usability, maintenance and administration issues, and in reality it is only a viable option for larger companies that have dedicated SharePoint developers and administrators. Additionally, there are cost implications of running a hybrid solution, as you have the cost of your on-premise farm and the monthly subscription costs.

    The Future of SharePoint and Office 365

    The future of SharePoint is clearly as an online service within the wider Office 365 suite of services and applications. The modern SharePoint application looks to be focused on collaboration services, enhanced with AI and VR. There have been significant new features released recently to improve the use of SharePoint Online as an intranet. Many former SharePoint features are now being handled by other teams (PowerBI for business intelligence, Delve for search, PowerApps for forms, Flow for workflow). This points to a trend in Office 365 of having lightweight apps with a specific purpose, such as Microsoft Forms.

    Additionally, there is a greater emphasis in modern SharePoint on integration with other O365 applications and services, and fewer customization options available. This applies to both branding (there are limited ways of branding SharePoint Online), and also in custom code options (the SPFx offers significantly fewer options than a full-trust solution). To compensate, Microsoft have been pushing no code tools like PowerApps and Flow that can be used by power users.

    For developers, developing for SharePoint Online will mean consuming a series of end-points from the various online services and applications being integrated with SharePoint. While we will be able to make use of modern web development techniques, I suspect there will be significantly less requirement to develop custom solutions as Microsoft develops new applications and services that can do a similar job for a monthly subscription. This allows organisations to have a low cost solution (initially – monthly subscriptions can quickly add up), and further locks the organisation into Office 365.

    SharePoint on-premise will continue to be released for legacy customers, for at least one more version (2022 anyone?), but it is basically going to become a container with basic functionality that allows easy integration with other subscription Office 365 services. This is very similar to the path being taken by Exchange on-premise. It is unlikely that many new customers will invest in SharePoint on-premise now, especially with Microsoft pointing towards Office 365 as the preferred option. For existing customers, the decision to upgrade will become harder, as each new release of SharePoint offers less and less business value. SharePoint is a mature product – it is unlikely to change significantly in the basic features it offers.

    SharePoint is an outstanding product that has evolved significantly over its twenty-year history. After many years battling it, I’m finally moving on to other areas of development (Azure and Xamarin, and also some Linux/Python development on the side), but I’ll always have a soft spot for it. Many thanks to all those readers who have struggled on to read this last instalment of a series that began over 18 months ago.

    22 May 2018

    How to Fix: ‘The trust relationship between this workstation and the primary domain failed’

    I come across this error several times a year, typically when I start up a Virtual Machine that hasn’t been used in a while. Many people will recommend leaving and then re-joining the domain to resolve this problem – this is a terrible idea. I came across this solution a couple of years ago, and after having to search for it again today, I decided to post it here so I can find it more easily in the future.

    To fix the error, reset the computer password using the netdom command line utility:

    netdom.exe resetpwd /s:<server> /ud:<user> /pd:<password>


    • <server> is a domain controller in the joined domain. You can get the domain controller name using this command: nltest /dclist:domainname
    • <user> is in DOMAIN\User format, and has the required rights to change the computer password

    I typically log on to the affected virtual machine using a local administrator account and run the command above from an elevated command prompt. Once the command succeeds, simply reboot the machine, and you will then be able to log on using a domain account as normal.

    12 Apr 2018

    Moving to GitLab

    I’ve recently moved my repositories from GitHub to GitLab. I thought I’d share a quick post on why I did this and how to go about it.

    Why Move?

    GitLab has a number of advantages over GitHub for source code hosting:

    1. Free private repositories – GitLab is financed by its Enterprise tiers, and has a free tier for personal projects and small teams. GitHub is free only for public repositories.
    2. GitLab has Continuous Integration/Delivery options built in, and has a number of DevOps features available to integrate with your specific workflow.
    3. GitLab has a wider range of options for creating project specific sites (using GitLab Pages), such as allowing you to use custom domains and SSL certificates for each project.
    4. GitLab, unlike GitHub, has an open source community edition (with an MIT licence), allowing you to setup your own GitLab server to self-host.
    5. GitLab has pretty much all the features that you expect from GitHub (Wikis, Markdown-based readmes, issue tracking). An important consideration for me was that GitLab, like GitHub, also offers Two Factor Authentication (2FA).
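
As an illustration of point 2, GitLab pipelines are driven by a `.gitlab-ci.yml` file in the root of the repository. A minimal sketch (the job name, image and script commands are placeholders for whatever your project actually needs) might look like:

```yaml
# Run a single test job on every push
stages:
  - test

run-tests:
  stage: test
  image: node:10          # any Docker image available to your runner
  script:
    - npm install
    - npm test
```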

    The disadvantages of moving to GitLab are:

    1. As GitHub has been around longer, network effects mean that the majority of developers and projects are already using it. As of 2017, GitHub had some 26 million users and 67 million repositories.
    2. The GitLab UI can occasionally be slow.
    3. Both offer integration with a number of third party services (such as Trello, Slack, Jenkins) but GitHub currently offers more integration options than GitLab.

    How to Move

    GitLab has made migrating from GitHub very easy. Simply create a new project, and on the new project page, select the ‘Import project’ tab. From here, you can select the GitHub import option. You will then be asked to sign in to GitHub and to authorize GitLab to access your GitHub repositories. On doing this, you are taken to the GitHub Importer page, which lists your projects on GitHub. You can then select the projects to import to GitLab individually, or simply import them all:

    Note, the import preserves each repository’s commit history. Once the projects are successfully imported, you can then decide whether to delete the source projects on GitHub. I deleted all of my private repositories on GitHub, and kept only my public projects.

    As GitHub is the more popular site for hosting source code, I wanted to be able to keep hosting and updating my public projects there easily. I had thought initially to add GitHub as a second remote to the new GitLab repository, so I could update both GitHub and GitLab repositories from the command line. However, it turns out that GitLab offers the Repository Mirroring feature. From the GitLab documentation:

    There are two kinds of repository mirroring features supported by GitLab: push and pull. The push method mirrors the repository in GitLab to another location, whereas the pull method mirrors an external repository into one in GitLab.

    By configuring the public repositories to push to the corresponding GitHub repository, I can simply commit my code to the GitLab repo as normal, and the commits are automatically reflected in the mapped GitHub project.
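
For comparison, the second-remote approach I originally considered can be sketched as below. This simulates the two hosting services with local bare repositories (in real use the remote URLs would point at GitLab and GitHub, and the user details are placeholders):

```shell
# Stand-ins for the GitLab and GitHub remote repositories
tmp=$(mktemp -d)
git init --bare --quiet "$tmp/gitlab.git"
git init --bare --quiet "$tmp/github.git"

# Working repository with two remotes configured
git init --quiet "$tmp/work"
cd "$tmp/work"
git config user.email "you@example.com"
git config user.name "Example User"
git remote add origin "$tmp/gitlab.git"
git remote add github "$tmp/github.git"

# Every change must now be pushed twice, once per remote
echo "# My Project" > README.md
git add README.md
git commit --quiet -m "Initial commit"
git push --quiet origin HEAD
git push --quiet github HEAD
```

Push mirroring removes that second manual push, which is why I preferred it.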

    Once the push mirroring was configured, I found that I could commit to GitLab and the code changes would be pushed to GitHub in a matter of seconds!

    Migrating to GitLab was a very simple process, and so far, I’m very happy that I made the move.