GitHub Actions, GitHub Pages, the Sea of Galilee and a Twitter bot

Tzafrir Ben Ami
4 min read · Feb 8, 2021
Image from Wikipedia

The Sea of Galilee is the lowest freshwater lake on earth, located in the northeastern part of Israel. Up until a few years ago the lake supplied around 25% of the country’s freshwater, but nowadays nearly half of the water supply actually comes from desalinated water and the country no longer depends on the lake. Nevertheless, maybe as a reminder of the past, many people still follow the lake’s water level measurements, published daily by the lake authorities and frequently quoted by mainstream media channels.

But enough with the history and geography lesson… In this blog post I will show how I’ve used GitHub Actions to create a Twitter bot that fetches the published daily measurements and tweets them. In addition, I also wanted a way to display and visualize past years’ measurements, and for that I’ve used GitHub Pages to host and render the historical data.

The inspiration came from this post, which uses GitHub Actions to capture daily weather data. The final outcome is available in my GitHub repository.

Using GitHub Actions to fetch data

In a previous post I’ve discussed how to use babashka, a scripting language built on top of Clojure, with GitHub Actions. You don’t have to use babashka, of course, since GitHub Actions supports all the major programming and scripting languages, so you can use your preferred one.

To capture the water level measurements I’ve written a small babashka script that fetches the data from the Israeli government’s online data repository API, and created a GitHub Actions workflow that calls this script and stores its output in the project repository:
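
The full workflow file lives in the project repository (the line numbers I refer to below are from that file). A minimal sketch of it, with placeholder script and file names and an example commit Action, might look like this:

```yaml
name: fetch-measurements

on:
  schedule:
    - cron: '0 14 * * *'   # once a day at 2PM UTC

permissions:
  contents: write          # let the workflow push the updated data file back

jobs:
  fetch:
    runs-on: ubuntu-latest
    steps:
      # check out the repository files into the workflow environment
      - uses: actions/checkout@v4

      # install babashka (one possible way; use whatever setup method you prefer)
      - name: Install babashka
        run: curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | sudo bash

      # run the fetch script and append the latest measurement to a local file
      # (the script path and data file name are placeholders)
      - name: Download measurements
        run: bb scripts/fetch_measurements.clj >> data/measurements.csv

      # commit the locally modified file back to the repository
      # (git-auto-commit-action is just an example of such a predefined Action)
      - name: Commit measurements
        uses: stefanzweifel/git-auto-commit-action@v5
        with:
          commit_message: "Add daily water level measurement"
```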

In line 4 you can see that the workflow is triggered once a day at 2PM UTC using POSIX cron syntax. The workflow itself is pretty straightforward: it uses a predefined Action to check out the repository files into the workflow environment (lines 10–11) so we can access those files during workflow execution. The Download measurements step (lines 12–19) runs the script that fetches the water level measurements and appends them to a local file, followed by another predefined Action that commits the locally modified file back to the repository (lines 25–33).

At the end of this workflow we can expect to have an updated list of daily measurements stored in the workflow repository.

Display historical data using GitHub Pages

GitHub Pages allows you to host a static website directly from your GitHub repository. You can enable it from the repository’s Settings section (there are some good online tutorials covering all the available options and how to use them). Being able to host only static HTML, CSS and JavaScript pages has its limitations, of course, but it covers all that I need in order to visualize the Sea of Galilee’s historical water level measurements.

For rendering the chart I’ve used Chart.js, a simple yet flexible JavaScript charting library. You do need a bit of JavaScript knowledge (which is basically all that I’ve got) to use the charting library, but reading its documentation and some online examples should cover most of what you’ll need to know.

You can visit the measurements visualization page I’ve created here.

Sea of Galilee water level measurements screenshot

The Twitter Bot

The final piece of the puzzle is the Twitter bot that posts a daily tweet with the latest measurement. In the GitHub Actions marketplace you can already find public Actions for sending a tweet directly from your GitHub Actions workflow. For this project I’ve used this Action, but there are others in the marketplace.

First you will need to create a Twitter application in the Twitter developer portal and get the app’s API tokens and authentication keys for tweeting via the Twitter API. It is recommended to store the API tokens as GitHub secrets, which means they are available to your GitHub Actions workflows but not visible as plain text to anyone with access to the repository.
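
For example, a workflow step can read a secret through the secrets context and use it as an environment variable; the secret name below is a placeholder for whatever you defined under the repository settings, and the value is masked if it ever shows up in the logs:

```yaml
# the secret is referenced through the secrets context and is never printed as plain text;
# TWITTER_CONSUMER_KEY is a placeholder secret name
- name: Verify the Twitter token is available
  env:
    TWITTER_CONSUMER_KEY: ${{ secrets.TWITTER_CONSUMER_KEY }}
  run: test -n "$TWITTER_CONSUMER_KEY" && echo "consumer key is set"
```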

I could have simply added an additional step for sending tweets to the existing data fetch workflow, but I wanted to separate concerns (what if in the future I want to fetch data several times a day but only tweet once?), so I’ve created a new GitHub Actions workflow to handle the actual tweets:
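
Again, the actual workflow is in the repository (the line numbers below refer to that file); a simplified sketch, with placeholder script, Action and secret names, might look like this:

```yaml
name: tweet-measurement

on:
  schedule:
    - cron: '0 15 * * *'   # once a day, some time after the data fetch workflow

jobs:
  tweet:
    runs-on: ubuntu-latest
    steps:
      # check out the repository so the script can read the latest measurement
      - uses: actions/checkout@v4

      # install babashka (same assumption as in the fetch workflow)
      - name: Install babashka
        run: curl -s https://raw.githubusercontent.com/babashka/babashka/master/install | sudo bash

      # build the tweet text and expose it as a step output
      # (the script name and the output wiring are placeholders)
      - name: Create message
        id: message
        run: echo "text=$(bb scripts/create_message.clj)" >> "$GITHUB_OUTPUT"

      # send the tweet using a marketplace Action
      # ("some-user/send-tweet-action" and its inputs are placeholders for whichever Action you pick)
      - name: Send tweet
        uses: some-user/send-tweet-action@v1
        with:
          status: ${{ steps.message.outputs.text }}
          consumer-key: ${{ secrets.TWITTER_CONSUMER_KEY }}
          consumer-secret: ${{ secrets.TWITTER_CONSUMER_SECRET }}
          access-token: ${{ secrets.TWITTER_ACCESS_TOKEN }}
          access-token-secret: ${{ secrets.TWITTER_ACCESS_TOKEN_SECRET }}
```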

This workflow is also triggered once a day (line 4), checks out the repository into the workflow environment (lines 9–10), executes a babashka script that creates the message to post to Twitter (lines 13–20) and finally sends the tweet using a predefined Action (lines 21–29).

However, what if there is no message to tweet? Maybe there is no new measurement available, or the previous script failed for some reason. I don’t want to tweet an empty message, and for that I’ve used GitHub Actions context and expressions syntax: the if condition in line 23 makes sure that the step is executed only if there’s a message to tweet, and skipped if the message is empty.
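
Using the same placeholder step id, output name and Action as in the sketch above, the guard is just an extra if line on the tweet step:

```yaml
# run the tweet step only when the previous step actually produced a message
- name: Send tweet
  if: ${{ steps.message.outputs.text != '' }}
  uses: some-user/send-tweet-action@v1
  with:
    status: ${{ steps.message.outputs.text }}
```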

Sum it up

You can use this harmless project as a reference for creating your own Twitter bot using GitHub Actions. I had a great time building it, despite the fact that it has no “real” value. Sometimes just having fun and learning new stuff is the real value of your side project. Isn’t it?
