brhfl.com

Honey walnut, please

Apple recently stirred up a bit of controversy when they revealed that their bagel emoji lacked cream cheese. Which is a ridiculous thing to get salty over, but ultimately they relented and added cream cheese to their bagel. Which should be the end of this post, and then I should delete this post, because none of that matters. But it isn’t the end, because I saw a lot of comments pop up following the redesign that reminded me: people really don’t seem to get how emoji work. Specifically, I saw a lot of things like ‘Apple can fix the bagel, but we still don’t have a trans flag’ or ‘Great to see Apple put cream cheese on the bagel, now let’s get more disability emoji’. Both of those things would, in fact, be great1, but they have nothing to do with Apple’s bagel suddenly becoming more edible.

Unicode is, in its own words, “a single universal character encoding [with] extensive descriptions, and a vast amount of data about how characters function.” It maps out characters to code points, and allows me to look up the division sign on a table, find that its code point is U+00F7, and insert this into my document: ÷. Transformation formats take on the job of mapping raw bytes into these standardized code points – this blog is written and rendered in the transformation format UTF-8. Emoji are not pictures sent back and forth any more than the letter ‘A’ or the division sign are – they are Unicode code points also, rendered out in a font2 like any other character. This is why if I go ahead and insert U+1F9E5 (🧥), the resulting coat will be wildly different depending upon what system you’re on. If I didn’t specify a primary font for my site, the overall look of this place would be different for different users also, as the browser/OS would have its own idea of a default serif font.
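To make the distinction concrete, here’s a quick sketch in Python (any Unicode-aware language would do): the character is the code point, and UTF-8 is just one way of spelling that code point out in bytes.

```python
# The division sign is code point U+00F7; the coat emoji is U+1F9E5.
divide = "\u00f7"
coat = "\U0001F9E5"

print(ord(divide))           # 247, i.e. 0x00F7
print(coat.encode("utf-8"))  # b'\xf0\x9f\xa7\xa5' -- four bytes under UTF-8
print(len(coat))             # still one code point, however your font draws it
```

What the coat actually looks like is up to whatever font ends up rendering that code point – the bytes say nothing about the picture.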


What pros?

When my Mac Pro recently slipped into a coma, I began thinking about what my next primary computer will be. Until this past week, Apple hadn’t updated any Macs in quite a while1, and the direction they’ve taken the Mac line continues to puzzle me. It all started (in my mind) with ubiquitous glossy screens, and has worked its way down to touchbars and near-zero travel keyboards. Last week’s update to (some) MacBook Pros is welcome, but underwhelming. Six cores and DDR4 is great, but that’s only in the large model. Meanwhile, if I wanted to suffer through a 15″ machine, HP’s ZBook 15 has either hexacore Xeons or Cores, inbuilt color calibration, a trackpoint, a keyboard that I feel safe assuming is superior to the MBP’s, and a user-upgradable design.

I remain consistently confused by what professionals Apple is targeting. As a creative user, I’d wholeheartedly prefer the aforementioned HP. Most illustrators I know rely on Surfaces or other Windows machines with inbuilt digitizers. I know plenty of professional coders on MBPs (and Apple seems to push this stance hard), but I don’t know why23 – that funky keyboard and lack of trackpoint don’t make for a good typist’s machine. The audio world makes sense: Logic is still a big deal, and plenty of audio hardware targets the platform. But honestly, when I see people like John Gruber saying the updated MBPs “are indisputably aimed at genuine ‘pro’ users”, I’m a bit baffled, as I simply can’t think of many professional use-cases for their hardware decisions as of late. They’re still extremely impressive machines, but they increasingly feel like high-end consumer devices rather than professional ones.


Brief thoughts on the iMac Pro

Yesterday, Apple announced the iMac Pro, an all-in-one machine purchasable with up to an 18-core Xeon processor. I can’t tell if this is a machine for me or not (I love Xeon Macs but not iMacs so much), but I also have no real reason to think about that beyond fantasy – I’m only on my 2nd Xeon Mac, and I expect to get a few more years out of it. They age well. The current, oft-maligned Mac Pro smashed an impressive amount of tech into a rather small, highly optimized space. It may lack the expansion necessary for typical Pro users, but it is a technological masterpiece. The new iMac, however, seems like an impossible feat1.

What truly excites me is the reinforcement that Apple is committed to its Xeon machines. The iMac Pro is not the mysterious upcoming Mac Pro. So while tech pundits have lamented the inevitable death of the Mac Pro in recent years, Apple has instead doubled down and will be offering two Xeon Macs rather than zero.

One final thought that is more dream than anything – Apple prides itself on its displays, and on its Pencil/digitizer in the iPad Pro. A lot of artists use pro software on iMacs with Cintiq digitizers. Cintiqs are top-of-the-line, but that doesn’t make them great. The digitizers are decent, the displays themselves are alright, but they aren’t spectacular devices – they’re just the best thing out there. I don’t expect Apple to move to a touch-friendly macOS, their deliberate UI choices show that this is a clear delineation between macOS and iOS. But I think working the iPad Pro’s Pencil/digitizer into an iMac2 could very well prove to be a Cintiq killer for illustrators, photographers, and other visual artists.


Darwin image conversion via sips

I use Lightroom for all of my photo ‘development’ and library management needs. Generally speaking, it is great software. Despite being horribly nonstandard (that is, using nonnative widgets), it is the only example of good UI/UX that I’ve seen out of Adobe in… at least a decade. I’ll be perfectly honest right now: I hate Adobe with a passion otherwise entirely unknown to me. About 85-90% of my professional life is spent in Acrobat Pro, which gets substantially worse every major release. I would guess that around 40% of my be-creative-just-to-keep-my-head-screwed-on time is spent in various pieces of CC (which, subscription model is just one more fuck-you, Adobe). But Lightroom has always been special. I beta tested the first release, and even then I knew… this was the rare excuse for violating so many native UI conventions. This made sense.

Okay, from that rant we come up with: thumbs-down to Adobe, but thumbs-up to Lightroom. But there’s one thing that Lightroom has never opted to solve, despite so many cries, and that is PNG export. Especially with so many photographers (myself included) using flickr, which reencodes TIFFs to JPEGs, but leaves the equally lossless PNG files alone, it is ridiculous that the Lightroom team refuses to incorporate a PNG export plugin. Just one more ‘RE: stop making garbage’ memo that I need to forward to the clowns at Adobe.

All of this just to come to my one-liner solution for Mac users… sips is the CLI/Darwin equivalent of the image conversion software that macOS uses for conversion in Preview, etc. The manpage is available online, conveniently. But my use is very simple – make a bunch of stupid TIFFs into PNGs.

for i in ./*.tif ; sips -s format png "$i" --out "${i/tif/png}" && rm "$i"

…is the basic line that I use on a directory full of TIFFs output from Lightroom. Note that this is zsh’s short-form loop – bash would want the full do/done form, though the ${i/tif/png} substitution itself is valid in both shells. Lightroom seemingly outputs some gross TIFFs, and sips throws up an error for every file, but still exits 0 and spits out a valid PNG. sips does not do parallelism, so a better way to handle this may be (using GNU parallel’s semaphore, sem):

for i in ./*.tif; sem -j +5 sips -s format png "$i" --out "${i/tif/png}"

…and then cleaning up the TIFFs afterward (rm ./*.tif). Either way. There’s probably a way to do both using flocks or some such, but I haven’t put much time into that race condition.
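One small caveat on those one-liners, for what it’s worth: ${i/tif/png} replaces the first ‘tif’ it finds anywhere in the name, so a hypothetical motif.tif comes out mangled. The suffix-stripping ${i%.tif}.png form is safer, and works identically in bash and zsh:

```shell
# ${i/tif/png} swaps the FIRST 'tif' found anywhere in the name --
# fine for most files, wrong for some.
i=./motif.tif
echo "${i/tif/png}"    # first match is inside 'motif': ./mopng.tif
echo "${i%.tif}.png"   # strip the .tif suffix instead:  ./motif.png
```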

At the end of the day, there are plenty of image conversion packages out there (ImageMagick comes to mind), but if you’re on macOS/Darwin… why not use the builtins if they function? And sips does, in a clean and simple way. While it certainly isn’t a portable solution, it’s worth knowing about for anyone who does image work on a Mac and feels comfortable in the CLI.


No escape

Assuming the leaked images of the new MacBook Pro are to be believed (and there seems to be no reason to think otherwise), tomorrow will bring MacBook Pros with a tiny touch strip display above the number row instead of a set of physical keys. It looks like a more practical version of the much-maligned Lenovo ThinkPad X1 Carbon concept. Yet, like the X1, it’s part of a bigger change that makes for an overall worse keyboard experience – in the case of the leaked MBP images, the physical keys themselves are moving to the slim-but-unloved keys from the MacBook.


Telephoto

As is to be expected whenever Apple announces something new, a lot of shit is being flung around in the tech sphere over the iPhone 7 and 7 Plus. One particularly fun nugget is that the secondary camera lens on the 7 Plus’s dual-camera system is not, despite what Apple says, a telephoto lens. This is based on a few mixed-up notions from people who know just enough about photography to think they know a lot: namely that ‘telephoto’ is synonymous with ‘long’, and that 56mm (135 equivalence, when will this die) is ‘normal’ (and therefore neither ‘long’ nor ‘telephoto’).

50mm was standardized on the 135 format because Oskar Barnack said so, essentially. Different versions of the story say that the 50 was based on a known cine lens design, or that glass to make the 50 was readily available, or that it was necessary to fill the new, larger image circle – but whatever the original motivating factor was, the original Leica I set a new standard with the 135 film format, and a new standard somewhat-longer-than-normal focal length with its Elmar 50/3.5.

The idea behind normalcy is matching our eyesight. This, conveniently, tends to match up with the length of the diagonal of the imaging plane: √(24²+36²)≅43mm. 50 is already noticeably longer than this, and 56 even more so. There’s a reason 55-60mm lenses were popular as more portrait-capable ‘normals’.
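The arithmetic behind ‘normal’ is easy enough to check for yourself – a quick sketch in Python, using the 135 frame’s 24×36mm dimensions from above:

```python
import math

# A 'normal' lens roughly matches the diagonal of the imaging plane.
diagonal = math.hypot(24, 36)    # 135-format frame, in mm
print(round(diagonal, 1))        # 43.3

# 56mm sits well past that diagonal...
print(round(56 / diagonal, 2))   # about 1.29x 'normal'
# ...and even the classic 50 is about 1.16x.
print(round(50 / diagonal, 2))
```

So by the diagonal yardstick, both 50 and 56 land on the long side of normal – which is the whole point.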