After is Now Open Source

I've started a complete rewrite of After, which will now be a cross-platform desktop app with a decentralized client/server architecture.  I've also open-sourced the code and moved it to GitHub.

After I have a solid foundation, I'll really focus on documentation.  Then I'll be looking for others to collaborate with me on it.


Blog Migration

... Or I can copy and paste it all manually.  Which is what I just did.  And I never want to do that again.  :)

Azure Limitations

I've run into a few limitations with Azure that some might find interesting.

First, when I was building the app services for various websites and desktop applications, I had to decide which pricing tier to use for each.  There are storage size and data transfer limits for the free and low-price tiers.

So I made two different service plans: a free one for web apps and sites that use very few resources, and a higher-tier paid one for the more demanding services.

Because some of those apps are very light on resources, I didn't take into consideration how many concurrent websocket connections they might need, and I didn't see any mention anywhere of websocket limitations.  There was just an on/off switch in the Azure portal.

Well, long story short, I found that there are some pretty stringent limitations for the lower-price service plans.  The free one only allows 5 concurrent websocket connections.  It will reject any connection attempts after that.

This is a little upsetting since those plans are already being metered for data transfer and CPU usage, and additional websockets don't inherently use more resources outside of those two items.  So it's an obvious ploy to force people into higher price tiers.

Secondly, I found that the MySQL database required for WordPress costs as much as all my web apps combined.  So I'll be switching to BlogEngine.NET soon, which allows you to store everything in JSON files.  It'll take a while to migrate all the data, though.  It's looking like I'll need to write a custom script for it.
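
If I do end up writing that script, a rough sketch might look something like the following: pull the published posts out of WordPress's wp_posts table and dump each one to a JSON file.  (The column names below are WordPress defaults; the output shape is just a stand-in, not BlogEngine.NET's actual storage format.)

using System;
using System.IO;
using MySql.Data.MySqlClient;  // MySQL Connector/NET
using Newtonsoft.Json;         // Json.NET

namespace BlogMigration
{
    public class Post
    {
        public string Title { get; set; }
        public string Content { get; set; }
        public DateTime PublishedOn { get; set; }
    }

    public static class Program
    {
        public static void Main()
        {
            var connectionString = "server=localhost;database=wordpress;uid=USER;pwd=PASSWORD;";
            var sql = "SELECT post_title, post_content, post_date FROM wp_posts " +
                      "WHERE post_status = 'publish' AND post_type = 'post'";

            using (var connection = new MySqlConnection(connectionString))
            using (var command = new MySqlCommand(sql, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    var index = 0;
                    while (reader.Read())
                    {
                        // Map each WordPress row to a simple object and write it out as JSON.
                        var post = new Post
                        {
                            Title = reader.GetString(0),
                            Content = reader.GetString(1),
                            PublishedOn = reader.GetDateTime(2)
                        };
                        File.WriteAllText($"post_{index++}.json",
                            JsonConvert.SerializeObject(post, Formatting.Indented));
                    }
                }
            }
        }
    }
}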

Move to Azure Completed

After a few bumps and mishaps, I finally have everything moved to Azure.

I have to admit, it was difficult to let go of my domains at first.  It felt kinda like I was losing part of my identity.  However, in the end, content can exist anywhere, and Azure is already helping me simplify my life.

A new version of CleanShot is on the way with lots and lots of new features.  I'll be posting about that soon.  :)

Changes to Websites and Services

I'm in the process of moving all my websites and services to Azure.  There will be many interruptions to services during the transition.

Affected apps and services include the Translucency website, all applications under Downloads and Projects, the InstaTech website, the InstaTech Package Builder, and After.

All sites and services will soon be running from subdomains of a single domain.  I'll be dropping the other domain names once they expire.  In the meantime, they will redirect to their replicated sites on Azure.  Only the http protocol will redirect properly, since I'm no longer maintaining SSL certificates for any of the old domains.

This means that existing links pointing to the HTTPS versions of my websites will not work.  The Azure sites themselves are still encrypted once you've been redirected to them.

Work on all of this should be completed by the end of July.  Many apps will remain broken until I update them to use the new services.

In the end, this will significantly reduce the cost and time required to maintain everything.

InstaTech Connections Overseas

I had the pleasure of working with an InstaTech user from Portugal this morning!  He's an IT professional who wanted to install InstaTech Server, but an error prevented him from completing the setup.

We were able to fix the problem, and in the process, I identified an area where I can improve the error reporting.  But the really cool thing was that I got to test using InstaTech from my server to connect to a computer overseas.  The computer was also running Windows in a different language!  InstaTech performed very well, and I'm really pleased with how it's coming together.

It was an exciting and rewarding experience in many ways.  Every time I meet a new person through one of my apps, I get a brief glimpse into different parts of the world and different people's lives.  It gives me this feeling of how big the world is, yet how small at the same time.  And the feeling of inclusion and togetherness that comes from that can't be properly labeled.  It's amazing.

SSL Requirements for InstaTech

Occasionally, I run into this situation: Someone wants to install InstaTech Server to evaluate it, but they get stuck at the SSL certificate installation part. InstaTech currently enforces SSL, so the certificate installation is necessary. I’m happy to help people get a free Let’s Encrypt certificate installed, but it’s not always possible.

I made encryption mandatory because I didn’t want to put anything out there that had any potential for being used in an unsafe manner. My thought is that someone who is less informed about the importance of encryption may opt to skip it if allowed and put themselves and their customers at risk.

I attempted to make this as easy as possible, though. After running the installer, a quick start guide opens, and the first item is a link to a tool for installing a free Let’s Encrypt certificate. This is by far the easiest way to get a certificate installed on an IIS server, in my opinion.

But recently, I’ve been thinking about those who are aware of the security implications, but simply want to test it internally first.

The first idea I had was to allow an unencrypted connection, but throw up huge warning messages. The second was to only allow it if it’s on the same subnet or domain (or some other manner of determining if it’s on the LAN).
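
For the second idea, a rough sketch of a subnet check might look something like this (IPv4 only; the class and method names are just illustrative, not actual InstaTech code):

using System;
using System.Net;

public static class LanCheck
{
    // Returns true if two addresses fall on the same subnet, given the mask.
    // Example: IsOnSameSubnet(IPAddress.Parse("192.168.1.50"),
    //                         IPAddress.Parse("192.168.1.10"),
    //                         IPAddress.Parse("255.255.255.0")) -> true
    public static bool IsOnSameSubnet(IPAddress remote, IPAddress local, IPAddress subnetMask)
    {
        byte[] remoteBytes = remote.GetAddressBytes();
        byte[] localBytes = local.GetAddressBytes();
        byte[] maskBytes = subnetMask.GetAddressBytes();

        if (remoteBytes.Length != localBytes.Length)
        {
            return false;  // e.g., an IPv4/IPv6 mismatch.
        }

        // Two addresses share a subnet when their masked (network) portions match.
        for (int i = 0; i < maskBytes.Length; i++)
        {
            if ((remoteBytes[i] & maskBytes[i]) != (localBytes[i] & maskBytes[i]))
            {
                return false;
            }
        }
        return true;
    }
}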

What are your thoughts? I’d love to hear them.


Inside StorageLists

When I began writing the back end for After, I started with Entity Framework/SQL Server.  I wanted to use the opportunity to teach myself the Code-First approach, as I’ve previously only done Database-First.

I really enjoyed working with Code-First, and it was a worthwhile investment of time.  However, I soon found that SQL Server isn’t well-equipped to handle multiple persistent connections that need to share a static context.  (Please refrain from saying, "Duh, I could have told you that!"  :D)

Instead of searching for something else, I decided to get creative and write my own.  I already had my data modeled in such a way that I could reference everything by ID/PK, and I liked how Code-First used the DbContext class and DbSets.  So I put those concepts into StorageLists.

The StorageList is basically a List<> that writes items to disk after a set period of not being accessed.  This timeout can be adjusted, but the default is 3 minutes.  Also, an item must pass the PersistenceFilter, if one is specified; otherwise it is held in memory and never written to disk.  Items written to disk are serialized to JSON.

Your models for each StorageList must implement IStorageItem, which adds the StorageID and LastAccessed properties.
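
In sketch form, the interface looks something like this (simplified, and the member types shown here are assumptions; see the repo for the actual definitions):

using System;

public interface IStorageItem
{
    // Unique ID used to locate the item on disk.  (Type assumed here.)
    string StorageID { get; set; }

    // Updated whenever the item is accessed; drives the write-to-disk timeout.
    DateTime LastAccessed { get; set; }
}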

Here’s the example use that’s in the readme:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Linq;
using System.Web;   // Needed for HttpContext.Current and Server.MapPath.
using StorageLists;

namespace MyProject
{
    public class World
    {
        // Static data context.
        public static World Current { get; set; } = new World();

        public World()
        {
            // Set the folder storage location for each StorageList.
            var server = HttpContext.Current.Server;
            Locations.FolderPath = server.MapPath("~/App_Data/World_Data/Locations");
            People.FolderPath = server.MapPath("~/App_Data/World_Data/People");

            // Only locations with a population greater than 5000 will be written to disk.
            Locations.PersistenceFilter = new Predicate<Location>(loc => loc.Population > 5000);
        }

        public StorageList<Location> Locations { get; set; } = new StorageList<Location>();
        public StorageList<Person> People { get; set; } = new StorageList<Person>();
    }
}
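
The Location and Person models aren’t shown above; hypothetical stubs that satisfy IStorageItem might look like this:

using System;
using StorageLists;

// Hypothetical model stubs; only the IStorageItem members are required.
public class Location : IStorageItem
{
    public string StorageID { get; set; }
    public DateTime LastAccessed { get; set; }
    public int Population { get; set; }  // Used by the PersistenceFilter above.
}

public class Person : IStorageItem
{
    public string StorageID { get; set; }
    public DateTime LastAccessed { get; set; }
}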