Setting up SSL/HTTPS with Let's Encrypt for

With an upcoming version of Firefox set to warn users when they’re browsing a site without SSL encryption enabled, I figured it wouldn’t hurt to get it all set up for this domain - - as an experiment. I knew that with Let’s Encrypt, the SSL certificates are free; it just requires a bit of work to get them set up.

Luckily, DigitalOcean - where I host this site/domain - has a handy guide for getting HTTPS set up with Let’s Encrypt (for Ubuntu Linux). Since I’m using the same HTTP server (nginx), I was able to follow along without needing any extra help.

I did have to adjust my DNS setup at Namecheap, since I had a wildcard (*) catch-all domain record. Once all I had were the necessary DNS entries (minus the wildcard), I followed the instructions pretty much as given.

Here’s the quick version of the setup instructions:

# add certbot repository
sudo add-apt-repository ppa:certbot/certbot

# update package list
sudo apt-get update

# install the python certbot client
sudo apt-get install python-certbot-nginx

Next, make sure that you have an nginx configuration file with the proper domains listed. In my case it was in a file under /etc/nginx/sites-available/ where the config included:

server {
    ...
    server_name example.com www.example.com;  # your actual domain names go here
    ...
}

The guide then explains that if you’re using a firewall, you need to allow both HTTP and HTTPS traffic through. Their example uses the ufw utility, but I double-checked that I’m allowing both kinds of traffic through with the firewall I’m using, iptables:

$ sudo iptables -L
Chain INPUT (policy ACCEPT)
target     prot opt source               destination
ACCEPT     tcp  --  anywhere             anywhere             tcp dpt:http
ACCEPT     tcp  --  anywhere             anywhere             tcp dpt:https
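If those two ACCEPT rules aren’t there, something like this should add them (a sketch assuming the default INPUT chain; adjust for your own rule ordering):

# allow incoming HTTP (port 80) and HTTPS (port 443)
sudo iptables -A INPUT -p tcp --dport 80 -j ACCEPT
sudo iptables -A INPUT -p tcp --dport 443 -j ACCEPT

Note that rules added this way don’t survive a reboot unless you persist them (e.g. with the iptables-persistent package).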

We now need to get a certificate using the certbot client (again, using my own domain as the example):

sudo certbot --nginx -d -d

You’ll be asked a few questions, but the process is quite quick. When it asked about the HTTPS settings, I told it to redirect all non-HTTPS requests to the HTTPS domain. If you look in your appropriate nginx configuration file, it should include a section like this:

# Redirect non-https traffic to https
if ($scheme != "https") {
  return 301 https://$host$request_uri;
} # managed by Certbot

If that section is still commented out, you’ll need to uncomment it and then restart nginx with: sudo service nginx restart
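Certbot also adds the SSL listener and certificate paths to the same server block. On my install, the added lines looked roughly like this (shown here with a placeholder domain; your paths will use your own domain name):

listen 443 ssl; # managed by Certbot
ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem; # managed by Certbot
ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem; # managed by Certbot
include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot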

Once that’s done - assuming there were no hiccups in any of the steps above - you should be serving your site securely. So this blog is now served only over HTTPS! This was so much easier than I had expected it to be. And as part of the certbot setup, it installs a recurring task to renew the certificate when needed, since Let’s Encrypt certs only last for three months at a time.
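If you want to confirm that the automatic renewal is wired up correctly, certbot has a dry-run mode that goes through the whole renewal process without actually replacing your certificate:

# simulate a renewal without touching the real certificate
sudo certbot renew --dry-run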

Once I get everything set up and going with Lists of Bests, then I’ll sort of know what I’m doing.

2017-11-09 — #nginx #security #site
~ ~ ~

How I'll build the new Lists of Bests

When Lists of Bests launched in 2003 (see previous post), the site consisted of a few Perl scripts and maybe an external CSS file. I don’t think I used a bit of JavaScript on the site at all, and my design skills were less than competent.

Despite all that, the site did quite well. I can’t recall the total number of users I had on the site at the time I sold it to the Robot Co-op, but it was maybe a few thousand. And that was many more than I had ever expected. All in all, I considered the project a success.

The web development landscape has changed quite a bit since then, and I have some new decisions to make about how I’m going to put the site back together again. I suppose using Perl again is an option, but I’ve forgotten everything about that language.

I think for the second version (well, third, if you count Robot Co-op’s version) of the site, I’m going to split the front-end and the back-end into separate pieces. That’s how we build things at work and it seems to work well. So, given that, I needed to make a decision on which direction to go.

For the back-end, I wondered whether I should go with a JavaScript server like Express, or stick with what I know and use Ruby on Rails - specifically the new-ish Rails API-only functionality.

On the front-end, I was a bit less sure. I’ve already built a small React application, but at work I’ve primarily worked with Angular. At this point, I’d say I’m probably a bit more proficient with Angular, but I’d like to attempt a larger project with React. And then there’s also Vue.js, which looks interesting and has gained a lot of steam lately.

In the end, based on experience and ability to find online documentation and examples, I think I’m going to go with React for the front-end, and Rails for the back-end. I think I’ll be able to spin something up quicker with these than with any other technology.

Finally, the original site stored all its data in a MySQL database, but I’ve been working with PostgreSQL the last six or so years, and I’m a little more comfortable with it.
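Spinning up that stack is only a couple of commands these days. A rough sketch (the project names here are hypothetical; --api and --database are real Rails 5+ options):

# API-only Rails back-end, backed by PostgreSQL
gem install rails
rails new lists-of-bests-api --api --database=postgresql

# React front-end scaffolded with create-react-app
npm install -g create-react-app
create-react-app lists-of-bests-web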

~ ~ ~

Bringing back an old friend

I have a crazy idea. It may even be a bad idea.

I’m going to try and bring Lists of Bests back from the dead.

If you don’t know what I’m talking about, here’s the story. Back in early 2003, I had an idea for a website where you could keep track of books you’ve read, films you’ve seen, and albums you’ve listened to that happened to be on a list of greats (think The Academy Awards, the Pulitzer Prize list, etc). The result was Lists of Bests. It started out with maybe 10 lists, but over the years grew to over 30 lists of all kinds.

Then in 2006, I sold the domain and the site to the Robot Co-op group (old blog post), and they added a lot of neat features to the site and integrated it with their other properties. It was in good hands, and they kept it running for many years.

But at a certain point in the last decade, the Robot Co-op - and Lists of Bests - ceased to exist.

Now, flash forward to a few weeks ago when Namecheap had a sale on domain name registration, where .org sites were only a few dollars to register for a year. On a whim, I picked up (the .com domain seems to have fallen into the hands of someone trying to make a buck), and thought I’d try to bring the site back up… in a way. I guess the big question is this: why?!

I think it will be a good chance to a) learn something new, and b) have something to write about here on the weblog. There are still many unanswered questions about this endeavour, but I’m going to give it a decent attempt. I don’t think it will end up being a fully functional site at the level it was before, but it could end up as a neat proof of concept. We’ll have to see where it goes.

I’ll likely be sharing the code, and I’ll definitely keep track of progress here on the blog. But who knows what I’ll end up with in the end. It could be fun, right? Wish me luck.

~ ~ ~

I think I'm going to like this book

Today I picked up Charles Stross’ 2004 book The Atrocity Archives from the library. So far, I’ve only read the introduction, but I really think I’m going to like this book. Here’s a small bit from Ken MacLeod’s intro:

It is Charlie’s experience in working in and writing about the Information Technology industry that gives him the necessary hands-on insight into the workings of the Laundry. For programming is a job where Lovecraft meets tradecraft, all the time. The analyst or programmer has to examine documents with an eye at once skeptical and alert, snatching and collating tiny fragments of truth along the way. His or her sources of information all have their own agendas, overtly or covertly pursued. He or she has handlers and superiors, many of whom don’t know what really goes on at the sharp end. And the IT worker has to know in their bones that if they make a mistake, things can go horribly wrong. Tension and cynicism are constant companions, along with camaraderie and competitiveness. It’s a lot like being a spy, or necromancer. You don’t get out much, and when you do it’s usually at night [emphasis mine].

Previously, I’ve read Stross’ book Accelerando and liked it. But this book, and the ones that follow it in the “Laundry Files” series, have come up a few times as I’ve read up on some other books recently.

So, I was really looking forward to this book after reading that intro, and then I saw the book’s subjects in the front pages.

  1. Geeks (Computer enthusiasts)–Fiction. 2. Intelligence service–Fiction. 3. Office politics–Fiction. 4. Great Britain–Fiction. 5. Demonology–Fiction. 6. Monsters–Fiction. 7. Nazis–Fiction.

Yeah, I think this is going to be good.

2017-03-17 — #books #literature #scifi
~ ~ ~

What I'm doing

A while back I came across the “now page” idea that Derek Sivers started. While it’s not a totally unique idea, it is a nice exercise in keeping a public “to do” or “doing” list.

I’ve added one here and I’m going to do my best to keep it updated. It doesn’t feel like I have a whole lot going on at the moment, but seeing everything there in one place does serve as a nice reminder of the tasks I’ve told myself I should be working on.

I’ll try and keep that page up to date, but also use it as a reminder of things to make myself busy with when I find myself thinking “I’m bored.”

2017-01-02 — #motivation #now