I’ve been working on this app which relates to my obsession with color. It’s an image processing app, and you can see some pictures made with it on our Tumblr.
This involved learning how to take images apart and put them back together, rewriting a lot of code in C for performance, and so on. But another problem I faced was how to test things involving images: how do I create test images, and how do I compare them?
Creating Test Images
The simplest way to do this is to draw the image into a context. This is not at all performant, so it isn’t really viable for much beyond small test images, but it does work.
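A minimal sketch of that approach, assuming UIKit. The function name, default size, and solid-color fill are my own illustration, not the actual helpers from the app:

```swift
import UIKit

// Hypothetical helper: draw a solid-color square by filling a rendered context.
// Drawing into a context is slow, so keep test images tiny.
func makeSolidImage(color: UIColor, size: CGSize = CGSize(width: 4, height: 4)) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { context in
        color.setFill()
        context.fill(CGRect(origin: .zero, size: size))
    }
}
```

Variants of this (a gradient, a checkerboard) cover most cases where a test just needs *some* known pixels to work with.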
I have three little helper functions that create some test images that I can work with.
The other thing I have is a function that turns an array of UIColors into an image. This is a bit more complicated, but helpful for some tests.
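A sketch of what such a function might look like, assuming the pixels are row-major RGBA bytes fed into a CGContext. The name and signature are my own, and the alpha handling is simplified (fine for the opaque colors you'd typically use in tests):

```swift
import UIKit

// Hypothetical sketch: build a width×height image from a flat,
// row-major array of UIColors by writing RGBA bytes into a CGContext.
func image(from colors: [UIColor], width: Int, height: Int) -> UIImage? {
    precondition(colors.count == width * height)
    var rawData = [UInt8](repeating: 0, count: width * height * 4)
    for (i, color) in colors.enumerated() {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        color.getRed(&r, green: &g, blue: &b, alpha: &a)
        rawData[i * 4 + 0] = UInt8(r * 255)
        rawData[i * 4 + 1] = UInt8(g * 255)
        rawData[i * 4 + 2] = UInt8(b * 255)
        rawData[i * 4 + 3] = UInt8(a * 255)
    }
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    guard let context = CGContext(data: &rawData,
                                  width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: width * 4,
                                  space: colorSpace,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue),
          let cgImage = context.makeImage() else { return nil }
    return UIImage(cgImage: cgImage)
}
```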
This leads me to the question of comparing images. For my purposes (and the app is heavily focused on colors), I can determine whether things have worked by comparing two color arrays. I could compare the rawData directly, but I want to abstract that away a bit to make my tests clearer. So I have another function that is basically the inverse of the one above: it extracts an array of pixel colors from an image.
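The inverse could be sketched like this, again with hypothetical names: redraw the image into a bitmap context we control, then read the bytes back out as UIColors.

```swift
import UIKit

// Hypothetical inverse: read an image's pixels back out as a row-major
// array of UIColors. Fine for tiny test images, far too heavy for real ones.
func colors(from image: UIImage) -> [UIColor]? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width, height = cgImage.height
    var rawData = [UInt8](repeating: 0, count: width * height * 4)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    guard let context = CGContext(data: &rawData,
                                  width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: width * 4,
                                  space: colorSpace,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return (0..<(width * height)).map { i in
        UIColor(red: CGFloat(rawData[i * 4]) / 255,
                green: CGFloat(rawData[i * 4 + 1]) / 255,
                blue: CGFloat(rawData[i * 4 + 2]) / 255,
                alpha: CGFloat(rawData[i * 4 + 3]) / 255)
    }
}
```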
Turning images into arrays of UIColors and vice versa is so-so performance-wise, and UIColors have a huge space overhead compared to the rawData array. It’s fine for testing, for very small images, or for a proof of concept, but not much more than that.
Then with two arrays I can just loop through and compare.
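That loop might look something like this (my own sketch, not the app's actual code). One detail worth hedging on: comparing with a small tolerance rather than exact equality, since colors round-trip through 8-bit channels and can come back off by a hair.

```swift
import UIKit

// Sketch of the comparison: two pixel arrays match if every pair of
// colors is equal within a small per-channel tolerance.
func colorsMatch(_ lhs: [UIColor], _ rhs: [UIColor],
                 tolerance: CGFloat = 1.0 / 255.0) -> Bool {
    guard lhs.count == rhs.count else { return false }
    for (a, b) in zip(lhs, rhs) {
        var ar: CGFloat = 0, ag: CGFloat = 0, ab: CGFloat = 0, aa: CGFloat = 0
        var br: CGFloat = 0, bg: CGFloat = 0, bb: CGFloat = 0, ba: CGFloat = 0
        a.getRed(&ar, green: &ag, blue: &ab, alpha: &aa)
        b.getRed(&br, green: &bg, blue: &bb, alpha: &ba)
        if abs(ar - br) > tolerance || abs(ag - bg) > tolerance ||
           abs(ab - bb) > tolerance || abs(aa - ba) > tolerance {
            return false
        }
    }
    return true
}
```

In a test, a failed match then points straight at a pixel-level difference rather than an opaque "images differ" result.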