Using MoveOn for petitions

It's been a long time since I started a petition to try to change something in my world. But in recent weeks my local City Council has been threatening to do some silly things related to funding the development of bike and pedestrian paths here, and I'd heard enough people say informally that they were concerned by those threats that I decided it was time to create a central spot where they could put all their names for Council members to see.

And that's how I ended up using the petitions.moveon.org service, which has turned out to be excellent for this purpose.

A few things in particular that I like about it:

  • While clearly built to support national and state-level petitions at scale, the MoveOn tool did a great job of enabling a smaller petition targeted at a local legislative body that might not otherwise be in their system. I was able to enter the names and email addresses of my local Council members as "targets" of the petition, and they were then set up to receive deliveries of signatures directly.
  • Related to that, the MoveOn system allowed the targets of the petition to respond directly to the petition signers with a message, without giving them direct access to each other's private contact information.
  • The system automatically picks small signature goals to start with and then scales them up as new milestones are hit. I think this helps avoid the awkward "WE'RE GOING TO HAVE A HUNDRED MILLION SIGNATURES HERE!" declarations by petition creators that quickly yield disappointment.
  • The system offered up interesting summary stats about where signatures were coming from and what activity on the petition looked like over time.
  • When I had to contact MoveOn's petition support (one of the signers had accidentally left out a word in a comment that significantly changed the meaning, and wanted it corrected) they were fast to respond and provided a quick solution.
  • Other features in the petition tool, like handling delivery via print and email, contacting petition signers, "declaring victory," and more seemed really well designed: simple, effective, and built for bringing about real action.
  • The petition application is open source and available for anyone to contribute to.

One of the things I wanted to do as the signature numbers climbed and as I prepared to present the petition to Council was create something that visualized the signing names in one place. The signature count on the petition itself was not very prominent, and because only 10 names were displayed at a time, it was easy to miss the sense of a large part of the local population making a clear statement about what they want.

So I sniffed the XMLHttpRequests being made by the MoveOn site and found the underlying API that was being used to load the signature names. I whipped up a simple PHP script that queries that API to fetch all the names, and then does some basic cleanup of the list: leaving out anything that doesn't look like a full name, making capitalization and spacing consistent, sorting, etc.
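For the curious, here's a rough sketch of that approach in PHP. The endpoint URL, query parameters and response fields are placeholders for illustration rather than MoveOn's actual internal API, so treat it as an outline of the idea rather than working code:

<?php
// Sketch of the fetch-and-clean approach. The endpoint, pagination scheme
// and response structure are placeholders, not MoveOn's real internal API.
$endpoint = 'https://example.org/api/petition-signatures';
$names = [];

for ($page = 1; ; $page++) {
    $json = file_get_contents($endpoint . '?page=' . $page);
    $data = json_decode($json, true);
    if (empty($data['signatures'])) {
        break;
    }
    foreach ($data['signatures'] as $signature) {
        $names[] = $signature['name'];
    }
}

// Basic cleanup: collapse whitespace, skip anything that doesn't look like
// a full name, normalize capitalization, de-duplicate and sort.
$clean = [];
foreach ($names as $name) {
    $name = trim(preg_replace('/\s+/', ' ', $name));
    if (!preg_match('/^\S+ \S+/', $name)) {
        continue;
    }
    $clean[ucwords(strtolower($name))] = true;
}
$clean = array_keys($clean);
sort($clean);

echo implode("\n", $clean) . "\n";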

I published the tool online at https://github.com/ChrisHardie/moveon-petition-tools in case anyone else might find it useful. (I later learned that MoveOn makes a CSV export of signatures and comments available when you go to print your petition, so that's an option too.)

Using the output of my tool, I created a simple graphic that shows all of the signed names to date.

All in all the MoveOn petition platform has been great, and I think it's made a difference just the way I wanted it to. I highly recommend it.

Gluing the web together with a personal webhook server

I mentioned earlier that I'm using a personal webhook server as glue between various online tools and services, and this post expands on that.

Why It Matters

For the open web to work well enough to bring people back to it, information has to be able to come and go freely between different systems without relying on centrally controlled hubs.

Just like RSS feeds, REST APIs and other tools that allow software programs to talk to each other across disparate systems, webhooks are an important part of that mix. Whereas many of those tools are "pull" technologies - one system goes out and makes a request of another system to see what's been added or changed - webhooks are a "push" technology, where one system proactively notifies another about a change or event that's occurred.
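As a concrete example, a webhook delivery is usually just an HTTP POST with a payload describing the event, sent to a URL the receiving system has registered. Something like this, where the endpoint and payload are entirely made up for illustration:

curl -X POST https://example.com/hooks/photo-uploaded \
  -H "Content-Type: application/json" \
  -d '{"event": "photo.uploaded", "photo_url": "https://example.com/p/1234.jpg"}'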

You might be using webhooks right now! Popular services like IFTTT and Zapier are built in part on webhooks in order to make certain things happen over there when something else has happened over here. "Flash my lights when my toaster is done!" And while I have great respect for what IFTTT and Zapier do and how they do it, I get a little uncomfortable when I see (A) so much of the automated, programmable web flowing through a small number of commercially oriented services, and (B) software and service owners building private API and webhook access for IFTTT and Zapier that they don't make available to the rest of us.

So I decided that just as I host RSS feeds and APIs for many of the things I build and share online (mostly through WordPress, since it makes that so easy), I also wanted to have the ability to host my own webhook endpoints.

How I Set It Up

I used the free and open source webhook server software. (It's unfortunate that they chose the same name as the accepted term for the underlying technical concept, so just know that this isn't the only way to run a webhook server.) I didn't have a Go environment set up on the Ubuntu server I wanted to use, so I installed the precompiled version from their release directory.

I created a very simple webhook endpoint in my initial hooks.json configuration file that would tell me the server was working at all:

[
  {
    "id": "webhook-monitor",
    "execute-command": "/bin/true",
    "command-working-directory": "/var/tmp",
    "response-message": "webhook is running",
  }
]

Then I started up the server to test it out:

/usr/local/bin/webhook -hooks /path/to/hooks.json -verbose -ip 127.0.0.1

This tells the webhook server to listen on the server's localhost network interface, on its default port of 9000. When I ping that endpoint I should see a successful result:

$ curl -I http://localhost:9000/hooks/webhook-monitor
HTTP/1.1 200 OK
...

From there I set up an nginx configuration to make the server available on the Internet. Here are some snippets from that config:

upstream webhook_server {
    server 127.0.0.1:9000;
    keepalive 300;
}

limit_req_zone $request_uri zone=webhooklimit:10m rate=1r/s;

server {
    ...

    location /hooks/ {
        proxy_pass http://webhook_server;
        ...

        limit_req zone=webhooklimit burst=20;
    }

    ...
}

This establishes a proxy setup where nginx passes requests for my public-facing server on to the internally running webhook server. It also limits them to one request per second (with a burst allowance of 20) so that a poorly configured webhook client can't hammer my webhook server too hard.

I generated a Let's Encrypt SSL certificate for my webhook server so that all of my webhook traffic will be encrypted in transit. Then I added the webhook startup script to my server's boot time startup by creating /etc/init/webhook.conf:

description "Webhook Server"
author "Chris Hardie"

start on runlevel [2345]
stop on runlevel [!2345]

setuid myuser

console log

normal exit 0 TERM

kill signal KILL
kill timeout 5

exec /usr/local/bin/webhook -hooks /path/to/hooks.json -verbose -ip 127.0.0.1 -hotreload

The -hotreload parameter tells the webhook server to monitor for and load any changes to the hooks.json file right away, instead of waiting for you to restart/reload the server. Just make sure you're confident in your config file syntax. 🙂

After that, service webhook start will get things up and running.

To add new webhook endpoints, you just add on to the JSON configuration file. There are some good starter examples in the software documentation.

I strongly recommend requiring a query string parameter that acts as a "secret key" before allowing any service to call one of your webhook endpoints. I also recommend setting up your nginx server to pass along the calling IP address of the webhook clients and then matching against a whitelist of allowed callers in your hook config. These steps will help make sure that only the services you intend can trigger your webhooks.
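Here's a sketch of what those two protections can look like in hooks.json, based on my reading of the webhook software's documentation. The secret value, parameter name, script path and IP range are placeholders, and it's worth double-checking the current trigger rule syntax against the docs:

[
  {
    "id": "protected-endpoint",
    "execute-command": "/path/to/some-script.sh",
    "command-working-directory": "/var/tmp",
    "trigger-rule": {
      "and": [
        {
          "match": {
            "type": "value",
            "value": "replace-with-a-long-random-secret",
            "parameter": {
              "source": "url",
              "name": "token"
            }
          }
        },
        {
          "match": {
            "type": "ip-whitelist",
            "ip-range": "203.0.113.0/24"
          }
        }
      ]
    }
  }
]

With that in place, a caller would have to hit /hooks/protected-endpoint?token=replace-with-a-long-random-secret from an allowed IP range before the command would run.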

Finally, I suggest setting up some kind of external monitoring for your webhook server, especially if you start to depend on it for important actions or notifications. I use an UptimeRobot check that ensures my testing endpoint above returns the "webhook is running" string it expects, and alerts me if it doesn't.
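If you'd rather roll your own check (from cron, for example), something as simple as this works, with the hostname swapped for your own:

curl -s https://example.com/hooks/webhook-monitor | grep -q "webhook is running" && echo "OK" || echo "ALERT"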

If you don't want to go to the trouble of setting up and hosting your own webhook server, you might look at Hookdoo, which hosts the webhook endpoints for you and then still allows you to script custom actions that result when a webhook is called. I haven't used it myself but it's built by the same folks who released the above webhook server software.

How I Use It

So what webhook endpoints am I using?

My favorite and most frequently used is a webhook that is called by BitBucket when I merge changes to the master branch of one of the code repositories powering various websites and utilities I maintain. When the webhook is called it executes a script locally that then essentially runs a git pull on that repo into the right place, and then pings a private Slack channel to let me know the deployment was successful. This is much faster than manually logging into my webserver to pull down changes or doing silly things like SFTPing files around. This covers a lot of the functionality of paid services like DeployBot or DeployHQ, though obviously those tools offer a lot more bells and whistles for the average use case.
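The script behind that hook isn't anything fancy. Here's a simplified sketch of the idea, with the repository path and Slack incoming-webhook URL as placeholders rather than my real ones:

#!/bin/bash
# Simplified deploy script run by the webhook server when BitBucket calls the hook.
# The repo path and Slack URL are placeholders.
set -e

REPO_DIR="/var/www/example-site"
SLACK_WEBHOOK_URL="https://hooks.slack.com/services/XXX/YYY/ZZZ"

cd "$REPO_DIR"
git pull origin master

# Ping a private Slack channel so I know the deployment happened.
curl -s -X POST "$SLACK_WEBHOOK_URL" \
  -H "Content-Type: application/json" \
  -d "{\"text\": \"Deployed latest master on $(hostname) from $REPO_DIR\"}"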

I use a version of this webhook app for SmartThings to send event information from my various connected devices and monitors at home to a database where I can run SQL queries on them. It's fun to ask your house what it's been up to in this way.

The connected vehicle monitoring adapter I use offers a webhook option that sends events about my driving activity and car trips to my webhook server. I also store this data in a database for later querying and reporting. I have plans to extend this to create or modify items on my to-do list based on locations I've visited; e.g. if I go to the post office to pick up the mail, check off my reminder to go to the post office.

I've asked the Fastmail folks to add support for a generic webhook endpoint as a notification target in their custom mail processing rules; right now they only support IFTTT. There are lots of neat possibilities for "when I receive an email like this, do this other thing with my to-do list, calendar, smart home or website."

Speaking of IFTTT, they do have a pretty neat recipe component that will let you call webhooks based on other triggers they support. It's a very thoughtful addition to their lineup.

Conclusion

I'm not expecting many folks to go out and set up a webhook server; there's probably a pretty small subset of the population that will find this useful, let alone act on it. But I'm glad it's an option for someone to glue things together like this if they want to, and we need to make sure that software makers and service providers continue to incorporate support for webhooks and other technologies of an open, connected web as they go.

My everyday apps

This may be one of those posts that no one other than me cares about, but I've been meaning to take a snapshot of the software tools I use on a daily basis right now, and how they're useful to me. (This comes at a time when I'm considering a major OS change in my life -- more on that later -- so as a part of that I need to inventory these tools anyway.)

So every day when I get up and stretch my arms in front of my energy inefficient, eye-strain-causing big blue wall screen with a cloud in the middle, grumbling to myself that we now call things "apps" instead of programs or software or really any other name, I see:

Airmail 3 - my main tool for reading, sorting, sending and finding email across a few different accounts. Replaced Thunderbird with it a few years back, really good choice.

Chrome - on the desktop, my daily browser of choice. I tried the new Firefox recently and liked it, but not enough to switch. I have lots of extensions installed but some favorites are Adblock Plus, JSONView and XML Tree, Momentum, Pinboard Tools, Postman, Signal Private Messenger, Vanilla Cookie Manager, Xdebug helper and Web Developer. On mobile I use iOS Safari.

macOS and iOS Calendar and Contacts - work with all of the various online calendars and address books I sync to, and have stayed pretty intelligent and user-friendly without getting too cluttered.

Todoist - for task management and keeping my brain clear of things I care about but don't want to have to remember.

Slack - for communicating with my co-workers and others. I have accounts on 11 different Slack instances, and only stay logged in to about 4 of those at once.

Continue reading "My everyday apps"

Creating a private website with WordPress

When we became parents in 2015, Kelly and I talked about where and how we wanted to share the initial photos and stories of that experience with a small group of our family and friends. In case you haven't noticed, I feel pretty strongly about the principle of owning our digital homes. So I felt resistance to throwing everything up on Facebook in hopes that we'd always be able to make their evolving privacy and sharing settings and policies work for us, while also trusting that every single Facebook friend would honor our wishes about re-sharing that information.

I took some time to explore tools available for creating a private website that would be relatively easy for our users to access, relatively easy to maintain, and still limited in how accessible the content would be to the wider world. (I tend to assume that all information connected to the Internet will eventually become public, so I try to avoid ever thinking in terms of absolute privacy when it comes to websites of any kind.)

I thought about using WordPress.com, which offers the ability to quickly create a site that is private and viewable only by invited users while maintaining full ownership and control of the content. I passed on this idea in part because it didn't allow quite the level of feature customization that I wanted, and partly because it's a service of my employer, Automattic. While I fully trust my colleagues to be careful and sensitive to semi-private info stored there, it felt a little strange to think of creating something a bit vulnerable and intended for a small group of people within that context. I would still highly recommend the WordPress.com option for anyone looking for a simple, free/low-cost solution to get started.

Here are the WordPress tools I ended up using, with a few notes on my customizations:

Basic WordPress Configuration

For the basic WordPress installation and configuration, I made the following setup choices:

  • I put the site on a private, dedicated server so that I had control over the management and maintenance of the site software (as opposed to a shared server where my content, files or database may be accessible to others).
  • I used a Let's Encrypt SSL certificate and forced all traffic to the SSL version of the site, to ensure all communication and access would be encrypted.
  • I set up a child theme of a default WordPress theme so I could add a few customizations that would survive future parent theme updates.
  • I set "Membership" so that "Anyone can register" in the role of Subscriber (see more below on why this is okay).
  • For Search Engine Visibility I set "Discourage search engines from indexing this site".
  • For discussion I set "

Continue reading "Creating a private website with WordPress"

Cloud email, contacts & calendars without Google

I like Google and a lot of the things it does in the world. When people ask me what free mail, calendaring and contact syncing tools they should use, I usually include Google's services in my answer. But I always explain that they're trading some privacy and ownership of their information for the "free" part of that deal. "You're the product, not the customer" and all that.

For me, I've always tried to avoid having my own data and online activities become the product in someone else's business model. There are plenty of places where I can't or don't do that, and I mostly make those tradeoffs willingly. But so far, I've been able to avoid using Google (and Apple and Microsoft) for managing my personal email, calendaring and contact syncing.

Here's how.

Continue reading "Cloud email, contacts & calendars without Google"

Chrome extensions to manage online privacy

There are a couple of extensions for Chrome that I've been using for a while now to try to maintain or improve my privacy online. Some have been helpful, others haven't. Some mini-reviews:

Terms of Service; Didn't Read

Almost every modern website has a "Terms of Service" document that governs your interactions with it. The document usually lays out how and when the site will use any data it collects about you - helpful, right? The document is also usually many pages long and would potentially take hours to fully absorb and understand. Terms of Service; Didn't Read is an extension that tries to give you a high-level view of the Terms of Service of the site you're on, based on their team's reading and interpretation of those documents on your behalf. If there are particular concerns related to privacy and personal data use, the extension will flag that when you arrive.

I used this extension for several months, finding it interesting at first to see how the sites I visited regularly measured up to TOSDR's evaluation. But after the initial curiosity wore off, I realized that for the most part, the information here wasn't changing my behavior. If TOSDR flagged something like "The copyright license is broader than necessary" or "This service tracks you on other websites," I'd still have to do some more digging to figure out exactly what that meant, and whether or not I was comfortable with it. So, the information provided by TOSDR is helpful, but not always conveniently actionable when it comes to protecting privacy. (There's a theme in all this: protecting privacy is rarely convenient.)

Continue reading "Chrome extensions to manage online privacy"

Use the cloud, keep control of your data

After ranting recently about the choices we make to give "big data" companies access to our private information in ways that might be abused or exploited by government eavesdroppers, I thought it would be worth sharing some of the options I've found for using "the cloud" while also retaining a reasonable level of control over access to the data stored there.

This post has information about tools and software you can deploy yourself to approximate some of the functionality that third party services might provide, but without the privacy and security exposure those services can introduce.  It's based on my experiences designing and implementing solutions for my own company, so it's mostly applicable to the interests of businesses and organizations, but may also be useful for personal projects.

A few important disclaimers: any time you make your personal or corporate data available on Internet-connected devices, you're creating a potential privacy and security vulnerability; if you need to keep something truly protected from unauthorized access, think hard first about whether it belongs online at all.  Also, the tools and services I'm listing here are harder to set up and configure than just signing up for one of the more well-known third party services, and may require ongoing maintenance and updates that take time and specialized knowledge.  In some cases, deploying these tools at all requires advanced technical skills, which is the reason most people don't or can't go this route.  Hosting and maintaining your own tools can often have a higher initial and/or ongoing cost, depending on what financial value you assign to data privacy.  Sometimes the privacy and security tradeoffs that come with using a third-party service are well worth it.

Still interested in options for using the cloud without giving up control over your data?  Read on.

Email and Calendar Sharing

Need a powerful, free email account?  Need robust calendar management and sharing capabilities? Everybody uses Gmail and Google Calendar, so just sign up for an account there, right?  Unless you don't want Google having access to all of your email communications and usage patterns, and potentially sharing that information with advertisers, government agencies or other entities.

Continue reading "Use the cloud, keep control of your data"

1Password alleviates the horrors of password management

I come to you today a recovering password management hypocrite.

I have over 190 accounts and logins for which a password or PIN is a part of my access: website tools, online banking, social media, email, internal company tools at Summersault, and so on.  I used to pretend that I was maintaining the security of these accounts by having a reasonably strong set of passwords that I re-used across multiple sites, sometimes with variations that I thought made them less likely to be broken into if someone did happen to compromise one of my accounts.

But as I prepared to give a talk in December about email privacy and security issues, and really stepped back to look at my own password management scheme, I realized just how much pretending I'd been doing, and just how vulnerable I was making myself to the increasingly well-equipped and highly-automated attempts at compromising accounts, stealing identities and stealing funds that are being launched every day.  I went and tested some of my passwords at the Password Strength Checker, and I was ashamed.  The potential impact of this really hit home as I read Mat Honan's personal tale of woe and his follow-up piece Kill the Password in Wired magazine.  Add in Passwords Under Assault from Ars Technica and you'll be shaking in your boots.

So I decided that I was not going to be that guy who goes around telling people about how vulnerable they are with their simplistic password schemes while quietly living a lie in my own password management scheme.  I might still be hacked some day, but I would not be found giving some teary-eyed interview to Oprah where I whined about how the pressure of the 190 accounts to manage just got to be too much and how I knew using a simple dictionary word plus a series of sequential numbers was wrong but I still didn't do the right thing.

That's when I found 1Password from AgileBits, a password management tool that alleviates the horrors of password management.

Continue reading "1Password alleviates the horrors of password management"

Replacing Notifo with Pushover

Two years ago I compared Notifo and Prowl as tools for sending custom push notifications to your mobile devices.  I ended up relying on Notifo quite a bit to send me mobile alerts about certain kinds of events that I might not otherwise notice right away - email messages from certain people, some kinds of calls or voicemails at my office, certain messages meant for me in the office chat room, etc.

(You might think all that alerting would get obnoxious, but having these notifications sent to me according to my preferences has meant I'm less likely to obsessively check email or other digital inboxes for something important I might be missing.  The good/important stuff gets to me fast, the rest waits for me to view it at my convenience.)

In September 2011, the creator of Notifo announced that he would be shutting down the service.  It's continued to mostly work since then without his intervention (a testament to the self-sufficient nature of what he created), but in the last few weeks I've seen increasing errors or delays in getting messages through, so I went in search of alternatives to Notifo.

Today I found Pushover, a really simple but elegantly done service that offers all the features I want.

Continue reading "Replacing Notifo with Pushover"