Over the past few years I’ve gotten into the unhelpful habit of checking
Twitter search to see if folks have mentioned my writing. I don’t actually do
anything with that though, beyond perhaps leaving a “like”.
I enjoy using Twitter, but this particular habit is just a way to waste time, so I wanted to try automating it away.
I got started by creating a new Twitter Developer account,
and then wrote up a simple script, GitHub repository, and GitHub Action in lethain/social-context.
You can see the key pieces at:
- retrieve.py: script that calls the Twitter API, parses the response, and writes each result to a text file
- .github/workflows/scrape.yml: configuration for a GitHub Action that runs the script once each day
- Under “Settings”, “Secrets”, and then “Actions”, I’ve added a repository secret named
BEARER_TOKEN, which holds the Twitter API bearer token issued to my developer account
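To make the flow concrete, here is a rough sketch of what a script like retrieve.py does. It is not the repository’s actual code: the function names, file handling, and deduplication logic are my assumptions, though the endpoint is Twitter’s real v2 recent-search API.

```python
# Hypothetical sketch of a retrieve.py-style script (names are assumptions).
import json
import os
import urllib.parse
import urllib.request

SEARCH_URL = "https://api.twitter.com/2/tweets/search/recent"

def fetch_tweets(query, bearer_token):
    """Call the Twitter v2 recent-search endpoint and return tweet dicts."""
    params = urllib.parse.urlencode({"query": query, "max_results": 100})
    req = urllib.request.Request(
        f"{SEARCH_URL}?{params}",
        headers={"Authorization": f"Bearer {bearer_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("data", [])

def append_new(filename, tweets):
    """Append tweet text to filename, skipping lines already present.

    Returns the number of newly written lines, so the caller (or a git
    commit step) can tell whether anything changed.
    """
    seen = set()
    if os.path.exists(filename):
        with open(filename) as f:
            seen = {line.strip() for line in f}
    added = 0
    with open(filename, "a") as f:
        for tweet in tweets:
            text = tweet["text"].replace("\n", " ").strip()
            if text not in seen:
                f.write(text + "\n")
                seen.add(text)
                added += 1
    return added

if __name__ == "__main__":
    tweets = fetch_tweets("lethain.com -RT", os.environ["BEARER_TOKEN"])
    append_new("lethain.txt", tweets)
```

Because `append_new` only appends lines it hasn’t seen before, re-running the script is idempotent, which is what keeps the git history quiet on days with no new mentions.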
The search queries are driven by lines 9-13 of retrieve.py:
    WEBSITES = (
        ('lethain.txt', 'lethain.com -from:lethain_bot -RT'),
        ('staffeng.txt', 'staffeng.com -RT'),
        ('infraeng.txt', 'infraeng.dev -RT'),
    )
If you wanted to run this for your own search queries, you’d just replace each tuple with the file to write into
and the Twitter search query you want to run.
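For example, a customized version of that tuple and the loop that consumes it might look like the sketch below. The filenames, queries, and `run_all` helper are hypothetical; the search function is injected as a parameter so the loop itself stays trivial:

```python
import os

# Hypothetical customization: each tuple pairs an output file with a
# Twitter search query. These filenames and queries are made up.
WEBSITES = (
    ('mysite.txt', 'mysite.dev -RT'),
    ('notes.txt', 'notes.example.com -from:my_bot -RT'),
)

def run_all(search, websites=WEBSITES, directory="."):
    """Run search(query) for each entry, appending results to its file."""
    for filename, query in websites:
        path = os.path.join(directory, filename)
        with open(path, "a") as f:
            for text in search(query):
                f.write(text + "\n")
```

One file per query keeps the git diffs readable: a new mention of one site only touches that site’s file.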
If you open up one of those files like lethain.txt, you’ll see entries like:
Staff engineer reflects on overwork and expecting better from peers https://t.co/FiebOfkf2k
Altogether, this works well! A new commit lands if there are new tweets, and otherwise nothing happens.
I have it set to run every day because I’m trying to check less, but you could certainly run it more frequently.
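A daily workflow of this shape might look roughly like the following. This is a sketch, not the repository’s actual scrape.yml; in particular the commit-and-push step is my assumption about how new results end up as commits:

```yaml
name: scrape
on:
  schedule:
    - cron: "0 8 * * *"   # once a day; use e.g. "0 */6 * * *" to run more often
jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: python retrieve.py
        env:
          BEARER_TOKEN: ${{ secrets.BEARER_TOKEN }}
      - run: |
          git config user.name "github-actions"
          git config user.email "github-actions@users.noreply.github.com"
          git add -A
          git diff --cached --quiet || git commit -m "New search results"
          git push
```

The `git diff --cached --quiet ||` guard is what makes “otherwise nothing happens” work: the commit step is skipped entirely when the search files are unchanged.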
The only annoying bit is figuring out how to send a useful notification.
Right now I’m getting email notifications when new commits occur (i.e. only when there are new search results),
but the emails don’t include the changed files themselves, so I’ll have to keep iterating on this a bit.