I recently made two space-themed posters, and I think they turned out pretty nicely! The first one that I made was honestly a bit of an accident. My initial goal was to create a sunflower, so I watched some YouTube videos on how spirals work, and I also read some code online to learn how they’re implemented as a program.
But as I played around with different spiral styles and backgrounds, I decided that it was better to go with this galaxy/wormhole poster instead.
I was really happy with how this poster came out, so I thought it would be fun to continue working on space art. A planet felt like a good next step, so I made this gas giant poster next. The planet itself is a circle with a gradient fill, and the rings are ellipses with a gradient stroke.
There’s something fascinating about creating art by simply specifying a set of rules for the computer. There are so many possibilities! I’m hoping to continue exploring generative art in the years to come. I’ve been collecting my favorite works at https://jagtalon.com/generative-art/ as well, so check that out from time to time if you’re interested!
I recently had some of my generative art printed, and I’m stoked to finally have them in my hands! It feels surreal to see my creative coding projects on my wall, taking up space in the physical world instead of being mere pixels on the screen.
It’s surprisingly straightforward and affordable. I first adjusted the <canvas> to be 3600×5400 pixels (12×18 inches at 300 dpi). Whenever I see a piece that I like, canvas-sketch lets me quickly export the image by pressing Ctrl-S.
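For reference, here’s a minimal sketch of canvas-sketch settings that produce that size. This uses the library’s physical-unit options (`units`, `pixelsPerInch`); my actual sketch settings may differ slightly:

```javascript
// Hedged sketch: canvas-sketch settings for a 12×18 in poster at 300 dpi.
const settings = {
  dimensions: [12, 18], // width and height in inches
  units: 'in',
  pixelsPerInch: 300,
};

// On export, canvas-sketch converts this to pixels: 3600×5400.
const [width, height] = settings.dimensions.map(
  (d) => d * settings.pixelsPerInch
);
```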
I didn’t know where to get it printed at first, so I had one printed at Staples. Unfortunately, the colors looked a bit dull, and they weren’t as vivid as I had hoped. It was also $10 a page, so it was a bit expensive. Fortunately, a friend of mine told me about a shop right here in Philadelphia called Fireball Printing.
They have a minimum of 5 pages per print job, but I was surprised that it only cost me $3.70 for all 5 prints! The colors also looked unbelievably accurate to me.
I’ve been hoping to sell physical prints on Etsy, but right now, I’m uncomfortable with picking up, packing, and mailing art when there’s a pandemic. So, for now, I have a 🛍 digital-only shop. Check it out!
I was put off by Tarsnap when I ran into it many years ago. I liked that it’s trusted and secure enough to be used by companies like Stripe, but I just didn’t want to put in the work to modify configuration files, write shell scripts, and set up cron jobs to back up my files. That felt like too much effort for me to manage. I have a laptop, not a server.
But over time, my backups have become a bit of a fragmented mess. I ended up adopting multiple services (and overpaying for storage) because they were easy and tightly integrated with my devices. I have iCloud for my photos, OneDrive for my documents, and Backblaze for everything else on my computer.
I wanted to consolidate everything I had into a single place, and I wanted it to be cheap and secure. Tarsnap came to mind again because it’s inexpensive enough for my needs, and I like that I’d be the only one holding the encryption keys. (It costs $0.25 per GB, so it’s not the cheapest one out there, but deduplication has done wonders for me.)
This is how I have it set up on Windows through WSL.
I don’t want it to make a backup of my whole drive because that would take forever to upload on my 5 Mbps connection. Even though Tarsnap is affordable, it could get expensive if I back up unnecessary program files.
So I created a variable called $directories in the backup script that lists the folders I care about in both Windows and Linux.
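A minimal sketch of what such a script can look like. The paths are examples, not my actual folders, and the `echo` makes this a dry run that just prints the command it would run:

```shell
#!/bin/sh
# Sketch of a tarsnap-backup.sh. Paths are examples; list your own
# Windows (/mnt/c/...) and Linux folders here.
directories="/home/jag/projects /mnt/c/Users/jag/Documents"

# Tarsnap archive names must be unique, so append a timestamp.
archive="wsl-backup-$(date +%Y-%m-%d_%H-%M)"

# Dry run: print the command instead of running it.
# Remove the `echo` to create the archive for real.
echo tarsnap -c -f "$archive" $directories
```

Leaving $directories unquoted is deliberate so that each path is passed as a separate argument (which also means paths with spaces would need extra care).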
I want the backup to run automatically in the background, so I set up a cron job. I typed in crontab -e and set Tarsnap to run every half hour (4pm, 4:30pm, 5pm, etc.):
*/30 * * * * /home/jag/tarsnap-backup.sh
I haven’t run into this, but I’ve read that cron can fail silently. So I also set up Dead Man’s Snitch to warn me if my cron job hasn’t run in the last 24 hours, just to be extra safe. They have a program called a Field Agent that monitors the execution time and output of the backup script.
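Even without the Field Agent, a plain check-in ping works: hit the snitch URL only when the backup succeeds. The URL below is a placeholder for the unique one Dead Man’s Snitch generates per snitch:

```shell
# Ping Dead Man's Snitch only if the backup script exits successfully.
# The nosnch.in URL is a placeholder; use your snitch's unique URL.
*/30 * * * * /home/jag/tarsnap-backup.sh && curl -s https://nosnch.in/your-snitch-id
```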
This part gets pretty hacky. The problem is that the cron service doesn’t start automatically in WSL 2. Even though I have cron configured, nothing will execute unless the cron service is running in the background. Since WSL doesn’t have a normal boot process, Linux services have to be started by hand, but I didn’t want to do that every time I boot up! I’d have to run the following manually:
Open Windows Terminal
Start Windows Subsystem for Linux by typing wsl or ubuntu
Run sudo /etc/init.d/cron start
Type in my password
I needed to make all these steps automatic. First, I removed the need to type in a password when starting cron by adding this to the sudoers file (which you can get to by typing in sudo visudo):
jag ALL=(ALL) NOPASSWD: /etc/init.d/cron start, /etc/init.d/cron status
Now that I can start cron without a password, I can get it to run in the background without a password prompt when I log in to WSL. I added this condition to config.fish (fish’s equivalent of .bashrc):
# Start cron if it's not running.
if sudo /etc/init.d/cron status | grep 'is not running'
    sudo /etc/init.d/cron start
end
Now that cron starts automatically, all I needed to do was run WSL when I log in on Windows. Pressing Win+R and typing in shell:startup took me to the startup folder. In it, I created wsl.bat, which just had ubuntu.exe in it.
Easier Tarsnap commands
I have trouble remembering Tarsnap commands, so I made some aliases to make it simpler for me. I wrote this function in ~/.config/fish/functions/tarsnap.fish:
function tarsnap
    switch $argv[1]
        case stats
            command tarsnap --print-stats --humanize
        case archives
            command tarsnap --list-archives | sort | tail
        case extract
            command tarsnap -xf $argv[2..-1]
        case list
            command tarsnap -tf $argv[2..-1]
        case '*'
            command tarsnap $argv
    end
end
This means that I could just type in tarsnap stats instead of tarsnap --print-stats --humanize and tarsnap archives instead of tarsnap --list-archives | sort | tail. Much easier to work with!
Was it worth it?
That was quite a bit of work to figure out and set up, especially on Windows! There should be fewer (and less hacky) steps on macOS or Linux, but I’m glad that it’s all running smoothly for me. There’s nothing left for me to do aside from making sure that it’s running. It should also be easier to set up on a new machine now that I know how to get it all working.
So was it worth it? I think it was. The reason I did this in the first place was that Tarsnap is affordable and more secure than anything I’ve used before, so I think I still come out ahead even after all that work. We’ll see if my backup preferences change in the future, but for now, I’m content with what I have.
I love the wave-like patterns in these images. I recently made these using <canvas> (which I’m learning about in the Creative Coding workshop that I’m taking), and each shape that you see is a Unicode character.
Their positions are evenly distributed on a grid while their sizes and rotations are generated using the noise2D function. I played around with the different variables (rotation, frequency, amplitude, size, and the number of elements), and it made these wonderful works of art.
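To make the layout concrete, here’s a small sketch of the even grid placement. The function name and margin handling are my own, not from the workshop; in the real piece, noise2D(u, v) would then drive each glyph’s rotation and size:

```javascript
// Hedged sketch: evenly distribute count×count points on a square canvas,
// keeping a margin on all sides. u and v run from 0 to 1 across the grid
// and can be fed to noise2D(u, v) for rotation and size.
function gridPoints(count, size, margin) {
  const points = [];
  for (let i = 0; i < count; i++) {
    for (let j = 0; j < count; j++) {
      const u = count <= 1 ? 0.5 : i / (count - 1);
      const v = count <= 1 ? 0.5 : j / (count - 1);
      points.push({
        x: margin + u * (size - 2 * margin),
        y: margin + v * (size - 2 * margin),
        u,
        v,
      });
    }
  }
  return points;
}
```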
I’d love to be able to print some of these so that I could hang them up on my wall.
The Canvas and WebGL workshop that I’m taking right now is taught using macOS, so the course assumes that I can access some Unix tools and commands in the dev environment. Since Windows isn’t built on top of BSD or Linux, I have to do a bit more work to get up and running.
Install Windows Subsystem for Linux
Windows Subsystem for Linux or WSL is the best way to get a Linux environment running on Windows. It’s a lightweight virtual machine that runs unmodified Linux binaries on top of Windows. It’s so fast that it doesn’t feel like I’m using a VM at all.
There are two versions of WSL, so I made sure to get WSL 2 because it’s 2-5x faster when it comes to IO operations like git clone and npm install—commands that I’ll often be using!
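If you’re unsure which version a distro is on, these commands (run from PowerShell) report it and make WSL 2 the default:

```shell
# Run from PowerShell: list installed distros with their WSL version,
# then make WSL 2 the default for new installs.
wsl --list --verbose
wsl --set-default-version 2
```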
Install Remote – WSL Extension on VSCode
The Remote – WSL extension on VSCode forces all the commands to run in the Linux VM. It makes sure that I’m interacting with Linux all the time while I’m in VSCode. So even though the editor is running on Windows, programs like Git or Node are all running in Linux.
Bonus: Install Windows Terminal
I’m a fan of good-looking terminals, and the built-in terminal on macOS is beautiful. It looks modern, and it supports tabs and theme customization. The built-in Command Prompt on Windows, on the other hand, looks quite ancient.
Fortunately, Windows Terminal is available in the Microsoft Store, and it supports tabs, vertical splitting, and even emojis. I don’t know why this isn’t included in the OS in the first place!
I’m learning how to draw randomized shapes on the <canvas> right now, and I love that I can create beautiful images by simply randomizing the xy-coordinates, the radius, and the color (using the excellent nice-color-palettes).
I can imagine this being useful for infographics, too. The color could represent a data point (like an app), the size could represent a property (like how often it tracked you or how much data it uses), and the xy-coordinates would be randomized in the image.
Now that I’m a full-time designer, most of my days are either spent planning projects in Asana or working on them on Figma. I hardly ever do any programming nowadays, and I miss the feeling of writing code and bringing websites to life.
I’ve been thinking about small programming projects that I could do in my spare time, and I recently ran into a creative coding course on Frontend Masters. I’ve spent a lot of time making websites and creating user interfaces, but I never thought about using programming to create art. The visual aspect of it is enticing, and I love the fact that it can be a creative outlet for me.
I’m early in the course, and right now, I’m going through the Canvas API. We’re on the topic of creating grids at the moment, and I’m excited about the idea of creating digital art that I can actually hang on the wall.
I’m enjoying it so far, and I’m thrilled to continue making art through code. I’ll make an effort to blog as I go through the course!
Repairability has become a big part of my decision making when I buy new devices these days. I don’t know much about electronics, but learning to make minor repairs on my iPhone, Android, and Roomba made me realize that I could easily give new life to old and hobbling tech. It felt good to keep something going, and I’m hoping that this will help save me money and reduce my e-waste in the long run. So, now that it’s time for me to find a new laptop, I wanted to make sure that there was a chance for me to at least swap out a few components if I needed to.
I’ve been a Mac user for many years now, so I thought it made sense for me to do some research on the latest MacBook Pro laptops first. How repairable are they? It would be great if I could avoid switching OSes in the first place.
I found out that the new MacBook Pros only scored an abysmal 1 out of 10 on iFixit’s repairability rating. Apparently, they’ve all had terrible scores since 2012. Everything is soldered and glued together, which means that I wouldn’t even be able to upgrade the drive or RAM if I wanted to. It was clear that I was going to have to leave macOS if I was adamant about getting a repairable laptop. (There are repairable Macs out there, but they’re desktops, not laptops. The Mac Mini scored a 6 out of 10, and the Mac Pro scored an amazing 9 out of 10! But even if I wanted a desktop, the Mac Mini is too underpowered for work, and the Mac Pro is almost too excessive.)
Trying it out
I wasn’t sure if I could even pull off moving to another OS. I knew that I wouldn’t be able to use Linux for work, but could I move to Windows? It’s been 13 years since I last used one.
I thought that the best way to know was to try it out. So, I committed to using Windows 10 on Boot Camp for a week to see if it was any good. I told myself that if it didn’t work out, I could just get a Mac like I always did.
Most of the apps I use are fortunately cross-platform. Figma Desktop, Krisp.ai, TiddlyWiki, VSCode, and Zoom all work the same way on Windows. For everything else, I had to do some work to find alternatives. Here’s where I moved to:
I’m sad that nothing really comes close to a Fantastical alternative on Windows. The built-in calendar is fine to manage both my personal and work calendars, but it’s missing a lot of the features that I use. It doesn’t have calendar groups/sets, and it doesn’t understand natural language. Typing in “Order chicken wings every Friday at 7pm” just won’t work on Windows.
The repairable laptop
I was convinced that Windows was good enough for my day-to-day work, so I went ahead and got a Dell XPS 17. Its slightly smaller counterpart got a 9 out of 10 on iFixit, and it’s got some good reviews online, too. I’ve been using it for a week now as I’m writing this, and I’m loving it.
I’ll still have my old MacBook Pro around to use from time to time (looking at you, Principle and Rotato), so I won’t be 100% on Windows. There’s also a lot to look forward to on the Apple side. They recently released their environmental progress report, and they talk about making it easy for people to repair their devices. I hope to one day see more repairable laptops from them.
During Hack Days last month, Robert and I thought it would be fun to make a Figma plugin that we could use while designing. We had many ideas, but we ended up settling on the idea of a plugin that fetches website metadata. We wanted a plugin that could fetch Open Graph information about a podcast, a song, or a news article, and we wanted to make it easy for people to import that data into Figma.
Since Figma runs in the browser (even though it’s written in C++!), plugins have to behave like any other website. For us, this meant that the plugin would have to follow restrictions like the same-origin policy: we can’t just fetch and scrape a website’s metadata from the plugin, since most sites won’t have the right CORS headers set.
For the Open Graph plugin to work, I had to set up an intermediary server that would scrape the website on behalf of the plugin. I’m not good at writing backend code, but I managed to write a simple server that was good enough for me to work with. You can find the server code on GitHub.
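To illustrate the scraping step, here’s a rough sketch of how the server side can pull Open Graph tags out of fetched HTML. The function name is mine (not from the repo), and a real implementation would use an HTML parser instead of this regex, which assumes `property` appears before `content` in each tag:

```javascript
// Hedged sketch: extract Open Graph meta tags from an HTML string.
// Returns e.g. { title: '...', image: '...' } for og:title, og:image.
// Assumes property="..." comes before content="..." in each <meta> tag.
function parseOpenGraph(html) {
  const tags = {};
  const re = /<meta[^>]+property=["']og:([^"']+)["'][^>]+content=["']([^"']*)["'][^>]*>/g;
  let match;
  while ((match = re.exec(html)) !== null) {
    tags[match[1]] = match[2];
  }
  return tags;
}
```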
The server is far from done, and there are definitely many edge cases that I haven’t come across yet. What happens when a website blocks me? How long should I cache the page? What if there are no meta tags on the page? Are there any security implications for the Figma user?
There’s also the problem of actually running this server. I don’t think I’d want to run and maintain a VPS, so I’m thinking of exploring the possibility of converting it into serverless code. I’m thinking of looking into Cloudflare Workers soon.
I expected the UI for this plugin to have a fair amount of re-rendering, so I wanted to learn React before jumping into this project. Unfortunately, I couldn’t find the time to learn React, but I did find a wonderful templating library called lit-html. It was easy for me to understand, and it got me up and running immediately.
The plugin code is at the point where it can fetch data from the server and import images into the Figma document. It was a big win for me, especially since working with images is a bit complicated in Figma because of its architecture. I think the next step for me is to make this plugin more robust. What should it do when the server or network is down? Will this run on all major browsers? What kinds of errors will the plugin encounter? I have to address all of these before releasing this to everyone. You can find the code on GitHub, too.
It was fun!
It felt good to take a short break from design and do some frontend work. I’ve been feeling a bit rusty, but it was good to know that I can still write programs from time to time. I only got to work on it for a week, but hopefully, I can find the time to get this to the finish line.
I’ve been using TiddlyWiki for two months now, and I’m amazed at how versatile this piece of software is. Since it’s able to weave together all the disparate parts of my life, it’s become my go-to notebook for almost anything. I’ve become dependent on it for documenting and understanding my life.
But if TiddlyWiki is going to be my notebook for everything, I have to get it to work on my phone, too. I’ve been hesitant about it because I assumed that it would be a pain to work on WikiText without a keyboard, but it didn’t end up being as bad as I thought it would be. I also like that I get to look things up on my wiki while I’m away from my computer.
The first thing I needed to do to get set up on my phone was to get an app called Quine. Even though TiddlyWiki is just an HTML file (there’s no server component here), saving can be a bit of a pain because browsers don’t allow websites to write directly to files. Quine is a modified browser built specifically for TiddlyWiki that lets you bypass that restriction.
Next, I needed to figure out how to sync the wiki between my computer and my phone. The simplest solution is to drop the wiki into a service like iCloud, Dropbox, or Resilio and call it a day. Any changes that I make on my phone would automatically sync with my computer and vice versa. If you’re thinking of doing this yourself, this might be good enough for you! Dropbox even lets you recover old versions of the file if you ever need to revert back to them.
But I wanted more control over my backups. I want to have the freedom to tweak the code and try out new themes and plugins without worrying about breaking my wiki. I wanted version control. I wanted Git on my phone.
I didn’t know if it was even possible to use Git on iOS, but I did a quick search and found Working Copy. All I had to do was link my GitHub account and point it to the right repository. GitHub now offers unlimited private repos for free, so I’ve taken advantage of that to back up my wiki. Working Copy downloads the repo to my phone and makes the files available to third-party apps like Quine.
While this is not as convenient as using Dropbox, it makes me feel more confident about my wiki’s integrity in the long run. I’m excited about this setup, and I hope I can keep this going in the years to come.