Don’t Worry A-Bot It: Part 1



Welcome to the Don’t Worry A-Bot It series here at Technically Speaking, where I’ll be taking a deep dive into some of the various bots I’ve been working on. Some of these serve a real purpose, others are just fun proofs of concept, but I’m proud of each one of them, and of what I’ve learned about different platforms, APIs and programming languages (mostly Python) along the way. If you want to learn more, dig into the code or perhaps even host a bot yourself, check out my GitHub page.

Without further ado, let’s get into it with two of my earlier bots, UGA Editing and Speed Complainer.

When I started exploring bots, the first thing that came to my mind was Twitter. For better or for worse, the platform is brimming with automated accounts, from dangerous foreign propaganda to harmless, clever bots that track new words in the New York Times, document follower counts or just tweet out a new color every few hours.

One particular bot I came to admire was (the since-unfairly-suspended) CongressEdits, which would monitor Wikipedia pages for anonymous edits made from IP addresses associated with the U.S. Senate and U.S. House of Representatives. On Wikipedia, if someone chooses not to sign into an account when making an edit, the edit will be marked with the IP address where the edit came from. Knowing that a particular IP address belongs to the Capitol Building in Washington, D.C., you can assume that the edit was likely made from inside the building (or at least on their network; what’s the WiFi like in the Capitol?). If an edit was detected, the bot would tweet it out almost immediately, with a link to the edit.
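
To make that mechanism concrete, here’s a minimal Python sketch (my own, not Ed Summers’ Node.js code) that follows Wikipedia’s public EventStreams recent-changes feed and flags anonymous edits from a watched IP range. The range below is a documentation placeholder, not the Capitol’s, and treating any user field that parses as an IP address as an anonymous edit is my own simplification:

```python
import ipaddress
import json

import requests  # pip install requests

STREAM_URL = "https://stream.wikimedia.org/v2/stream/recentchanges"

# Placeholder range (documentation space) -- substitute the ranges you want to watch
WATCHED_RANGES = [ipaddress.ip_network("203.0.113.0/24")]

def is_watched_anon(user: str) -> bool:
    """An anonymous edit shows an IP in the 'user' field; check it against our ranges."""
    try:
        ip = ipaddress.ip_address(user)
    except ValueError:
        return False  # a registered username, not an anonymous edit
    return any(ip in net for net in WATCHED_RANGES)

with requests.get(STREAM_URL, stream=True) as resp:
    for raw in resp.iter_lines():
        if not raw.startswith(b"data: "):
            continue  # skip SSE comments, event names and keep-alives
        change = json.loads(raw[len(b"data: "):])
        if change.get("wiki") == "enwiki" and is_watched_anon(change.get("user", "")):
            print(f"Anonymous edit to '{change['title']}' from {change['user']}")
```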

Admittedly, aside from some funny hijinks from congressional interns that caused the bot to tweet out edits about random Wikipedia pages, and eventually even about edits made to its own Wikipedia page, most of the edits were harmless and minor. The bot never really caught anything scandalous or incriminating. Unfortunately, after staffers decided in 2018 to dox Republican senators during the Brett Kavanaugh Supreme Court confirmation hearings, the bot unwittingly publicized the edit that included the senators’ personal phone numbers and addresses. Twitter then decided to suspend the account for violating the platform’s terms of service.

My thoughts on that aside, even before that incident, I was interested in creating my own version. There are probably 50 or so clones of CongressEdits on Twitter today, thanks to Ed Summers, the bot’s developer, publishing his “anon” source code on GitHub. The clones monitor everything from the NYPD to foreign governments to a handful of U.S. colleges. I initially intended to make one to monitor the Atlanta city government, knowing its recent history with scandal, but unfortunately, I could not figure out how to easily determine its IP addresses. I know the IP address of its website, but who’s to say the website is hosted on the same network?

So, instead, I decided to monitor edits made at my college, The University of Georgia. At the time, there were only one or two other college-monitoring accounts (I’d like to think I inspired the rest), but a college seemed like an obvious target. Between professors updating pages with new research, students editing pages to alter facts for their projects, or even administrators trying to scrub negative information, I felt the bot had potential.

While I’m not too good with Node.js, Ed Summers thankfully made forking and running the code simple, with only minor edits needed: adding Twitter API keys and setting the whitelisted and blacklisted IP ranges and pages. So, late one night, I drove down the street to campus and wandered around a few spots, checking my phone’s current IP address and cross-referencing it with a reverse-IP lookup site, until I discovered the two main IP ranges associated with UGA’s PAWS-Secure WiFi network. I popped those into the configuration file, and soon the bot was up and running off my Raspberry Pi.
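
That late-night reconnaissance can be partly scripted, too. A reverse-DNS lookup often reveals who owns an address, since campus networks usually hand out hostnames under the school’s domain. Here’s a toy sketch; the address below is a placeholder, so substitute whatever your phone reports:

```python
import socket

def owner_hint(ip: str) -> str:
    """Reverse-DNS lookup; the PTR hostname often hints at who owns the address."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        return hostname
    except socket.herror:
        return "(no PTR record)"

# Placeholder address -- on PAWS-Secure you'd hope to see something ending in uga.edu
print(owner_hint("192.0.2.1"))
```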

Months later, I made some minor modifications to the configuration file and the source code to clean up some syntax and allow the bot to monitor and tweet both: 1) any change to any page by an IP address from campus, and 2) any change to specific UGA-related pages by anyone, anonymous or not. This has led to the bot publicizing not only professors updating random pages with their research, but also a lot of smack talk and vandalism, particularly on football-related pages.
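
In Python terms (the actual bot is Ed Summers’ Node.js code), that combined filter boils down to two rules. The ranges and page titles here are placeholders, not UGA’s real configuration:

```python
import ipaddress

CAMPUS_RANGES = [ipaddress.ip_network("203.0.113.0/24")]  # placeholder, not UGA's real ranges
WATCHED_PAGES = {"University of Georgia", "Georgia Bulldogs football"}  # illustrative titles

def should_tweet(change: dict) -> bool:
    """Rule 1: any anonymous edit from a campus IP, to any page.
    Rule 2: any edit at all, logged in or not, to a watched UGA-related page."""
    user = change.get("user", "")
    try:
        from_campus = any(ipaddress.ip_address(user) in net for net in CAMPUS_RANGES)
    except ValueError:
        from_campus = False  # a registered username, so rule 1 can't apply
    return from_campus or change.get("title") in WATCHED_PAGES
```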

I’ve since also visited the IRC chatroom where all Wikipedia edits are posted, and from which all these monitoring bots pull their data. I have to say, it was exciting to see UGAediting (UGAedits was already taken) right there alongside its bigger brothers.
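
If you want to peek at that firehose yourself, the feed lives on irc.wikimedia.org, with one channel per wiki (e.g. #en.wikipedia). Here’s a bare-bones Python listener as a sketch; a real bot would handle registration and reconnects more carefully, and the nickname is made up:

```python
import socket

# Wikimedia's public IRC feed of edits: one channel per wiki
HOST, PORT, CHANNEL = "irc.wikimedia.org", 6667, "#en.wikipedia"

sock = socket.create_connection((HOST, PORT))
sock.sendall(b"NICK edit-feed-demo\r\nUSER demo 0 * :demo\r\n")
sock.sendall(f"JOIN {CHANNEL}\r\n".encode())

buffer = b""
while True:
    buffer += sock.recv(4096)
    while b"\r\n" in buffer:
        line, buffer = buffer.split(b"\r\n", 1)
        if line.startswith(b"PING"):
            # answer the server's keep-alive or we'll be disconnected
            sock.sendall(line.replace(b"PING", b"PONG", 1) + b"\r\n")
        elif b"PRIVMSG" in line:
            print(line.decode("utf-8", errors="replace"))  # one line per edit
```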

The bot is still running off my Raspberry Pi 3, and is still active at @UGAediting on Twitter.

Around the same time, I learned about another clever Twitter bot that a user had made to test and log his home internet speeds, and to tweet at his provider whenever his speed dropped below a certain fraction of what he was paying for. I can’t remember if I was facing issues with my own internet speeds and had a true need for this bot, or if it was just another interesting project (possibly a bit of both), but I jumped at the code.

While this bot was written in Python, a language I was unfamiliar with at the time (plus, I was trying to combine it with a similar speed-test bot that uploads its results to a Google spreadsheet), it was thankfully also relatively simple to set up: just update the Twitter credentials and basic configuration like the text to tweet out, who to tag in the tweets and what thresholds should trigger a tweet. At the time, I was paying Spectrum for 100 Mb down, so I set the bot to trigger at 75 Mb or less. That caused the bot to go off nearly every time it ran its check, every 15 minutes. I eventually had to drop the threshold to 60 Mb to keep it from firing off so often (unlike UGAediting, these tweets came from my personal account; I’m sure Spectrum’s support team hated me after a while). Admittedly, though, it’s really sad that I had to scale back my speed complainer because my internet speed was so frequently much slower than what I was paying for.
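
The core loop is easy to sketch. This isn’t James Atkinson’s actual code, just a minimal approximation of the pattern using the speedtest-cli and tweepy packages; the credentials and provider handle are placeholders:

```python
import speedtest  # pip install speedtest-cli
import tweepy     # pip install tweepy

# All placeholders -- substitute your own credentials and your provider's handle
API_KEY, API_SECRET = "...", "..."
ACCESS_TOKEN, ACCESS_SECRET = "...", "..."
PROVIDER = "@YourISP"
THRESHOLD_MBPS = 60  # only complain below this

st = speedtest.Speedtest()
st.get_best_server()
down_mbps = st.download() / 1_000_000  # speedtest reports bits per second

if down_mbps < THRESHOLD_MBPS:
    auth = tweepy.OAuth1UserHandler(API_KEY, API_SECRET, ACCESS_TOKEN, ACCESS_SECRET)
    tweepy.API(auth).update_status(
        f"Hey {PROVIDER}, I pay for 100 Mbps but I'm only getting {down_mbps:.1f} Mbps."
    )
```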

Eventually, due to conflicts of interest, I deactivated the bot, and I have since switched from 100 Mb down with Spectrum to 1 Gb symmetrical with Google Fiber. Complaining about only getting 750 Mb when I pay for 1 Gb would seem snooty, so this bot is probably going to stay off for the time being (plus, the Pi 3’s 100-Mbit Ethernet port would constantly return false results and incorrectly trigger a tweet). I do still have the second speed-test program running, checking my speed every hour on the hour and logging the results to the Google spreadsheet.
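
The spreadsheet-logging half is similarly small. Here’s a sketch using the gspread package, assuming a Google service account that has been shared on the sheet; the filename and sheet name are placeholders:

```python
import datetime

import gspread    # pip install gspread
import speedtest  # pip install speedtest-cli

# Placeholder names -- a service account JSON key, and a sheet shared with that account
gc = gspread.service_account(filename="service_account.json")
sheet = gc.open("Speed Log").sheet1

st = speedtest.Speedtest()
st.get_best_server()
sheet.append_row([
    datetime.datetime.now().isoformat(timespec="seconds"),
    round(st.download() / 1_000_000, 2),  # download, Mbps
    round(st.upload() / 1_000_000, 2),    # upload, Mbps
])
```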

Download speeds (Mbps) from Google Fiber at my apartment over the last week (the Raspberry Pi 3 is limited by its 100-Mbit Ethernet port, but it’s clearly often near that ceiling)

I particularly appreciate the results from when I moved from 100 Mb down/20 Mb up to 1 Gb down/1 Gb up. See if you can spot the moment in the graph below. Again, because of the Pi’s hardware limitation, the download speed isn’t noticeably different, but the upload speeds jumped from approximately 5 or 10 Mb to close to 90 Mb. With a gigabit port (like the one on the new Pi 4), the download results would likely have been nearly ten times higher.

There’s not much difference in the download speeds (blue) because of the Pi’s 100-Mb Ethernet port, but going from 5 Mb up to 90 Mb up (red) was a nice change of pace.

Because this was my first foray into the world of bot development, neither of these projects was built from scratch like my later ones. But thanks to open-source code, helpful developers and simple setups, they both gave me a great introduction to programming, Twitter’s API libraries, setting up a Twitter account and applying for developer credentials, and running repeating scripts with cron jobs and launching scripts at boot on a Raspberry Pi.
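
For anyone curious, the scheduling side is just cron. Here are hypothetical crontab entries (the paths and script names are made up) that mirror the setup:

```
# Log a speed test at the top of every hour (installed with `crontab -e`)
0 * * * * /usr/bin/python3 /home/pi/speedlogger/log_speed.py

# Relaunch the Wikipedia edit bot whenever the Pi boots
@reboot /home/pi/anon/start.sh
```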

Thanks to Ed Summers at the University of Maryland for developing “anon,” and to James Atkinson for developing “speedcomplainer.”

Next time: more Twitter bots, this time built from scratch.
