All the rumors pointed to new M1X MacBook Pros being released today at WWDC, but it did not come to pass. Perhaps the global chip shortage is to blame. Or maybe it’s the constrained availability of the mini-LED displays expected to be used in the 14″ version. Whatever the reason, we will have to keep waiting for new professional M1-powered Mac laptops.
Hopefully Apple drops these before July. I have a kid heading off to college in August, and I was hoping to send her with my current work machine, the M1 MacBook Air, while I moved up to the 14″ MacBook Pro. But I’m kind of torn about it, as the M1 MacBook Air I have been using as my main work computer is working quite well now that Docker and Homebrew are M1 native. There have been a few quirks to work through (mostly related to NodeJS and the lack of an M1-native build for any Node version earlier than 15), but overall, it’s been a lightweight, very capable dev machine. If they offered a version with 32GB of RAM and a few more USB-C ports, I’d probably not even need the 14″ MacBook Pro.
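For the curious, the Node workaround amounted to running the installs under Rosetta 2. A minimal sketch, assuming nvm is your version manager (the version number is just an example):

```shell
# On an Apple Silicon Mac, "arch -x86_64" launches a command as an Intel
# process under Rosetta 2, so nvm fetches the x86_64 build of a pre-15
# Node release (no arm64 builds exist for those versions).
if [ "$(uname -s)" = "Darwin" ] && [ "$(uname -m)" = "arm64" ]; then
  arch -x86_64 zsh -ic 'nvm install 14'
  node -p 'process.arch'   # should report "x64" when running under Rosetta
else
  echo "not an Apple Silicon Mac; skipping"
fi
```

Native arm64 Node builds begin at version 15, so once you can move past 14, the Rosetta detour becomes unnecessary.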
If you are using one of the new M1 Apple Silicon Macs, you may be wondering how to tell if an app you have is optimized for Apple Silicon. There are a couple of ways to deduce this.
Get Info: The first option is to use the “Get Info” option in the Finder. Navigate to the app’s location (usually /Applications), click on the app, and then press Command-I. Under the “General” heading will be a listing for “Kind:”, with three possible options – Application (Universal), Application (Intel), Application (Apple Silicon). The Intel and Apple Silicon options should be self-explanatory. The “Universal” option means the app is a “fat binary”, containing the code for both the Intel and Apple Silicon versions. Fat binaries cover both architectures, but they are also nearly twice the size of a binary compiled for a single architecture.
Activity Monitor: If the app is already running, you can open Activity Monitor (located in /Applications/Utilities) to see a list of all running processes. In the column labeled “Kind”, you will see either “Intel” or “Apple” listed. Activity Monitor shows the code that is actually executing, so even if the application is a fat binary, it will only show the architecture currently being run.
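From the command line, you can also inspect the app’s executable directly. A quick sketch (the Safari path is just an example; lipo ships with Apple’s command line tools):

```shell
# "file" prints the architecture(s) a binary contains; a Universal (fat)
# binary lists both x86_64 and arm64 slices. /bin/ls is used here only
# because it exists on any Unix system.
file /bin/ls

# macOS-only alternative with terser output, commented out here:
# lipo -archs /Applications/Safari.app/Contents/MacOS/Safari
# A Universal binary prints something like: x86_64 arm64
```

Either tool tells you what the binary contains, while Activity Monitor tells you which slice is actually running.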
Here are a few other comparative benchmarks on the new MacBook Air M1 (16GB/512GB configuration), pitted against a MacBook Pro 16″ (i9/64GB/4TB/Radeon 5500M-8GB).
Blender (running in Rosetta 2 on the M1). Demo files can be found here.
Fishy Cat (1 frame):
MacBook Air M1: 1 min 35 sec
MacBook Pro i9: 37 sec
Mr. Elephant (1 frame):
MacBook Air M1: 2 min 18 sec
MacBook Pro i9: 1 min 13 sec
Racing Car (1 frame):
MacBook Air M1: 13 min 22 sec
MacBook Pro i9: 8 min 52 sec
Now, of course this is hardly a fair fight. The i9 MacBook Pro has a discrete GPU (in this case, a Radeon Pro 5500M with 8GB). And Blender is being run via Rosetta. But in the wake of the ridiculous walloping all the Intel Macs are receiving from these entry-level M1 machines, I thought it’d be nice to share an area where the Intel Macs are still (at least for the moment) worth their money.
Ok, I’ll admit my previous post about the release of Apple Silicon powered Macs gave the impression that there wasn’t much exciting about the CPU switch, but boy, was I wrong.
I’ve had my hands on the M1 powered MacBook Air (with 16GB RAM/512GB SSD) for just a few hours now, and after putting it through its paces, I’m floored.
I’ve run two benchmarks – XcodeBenchmark and Bruce X. These are more akin to real world benchmarks. I’m not knocking Geekbench or Cinebench, but these benchmarks involve actual applications that people will use, and actual projects that simulate what real world performance will look like. After running each of them and comparing the results against my $4400 MacBook Pro 16″ (i9/64GB RAM/4TB SSD), the numbers speak for themselves.
Bruce X Benchmark
MacBook Pro i9 2.4GHz/64GB/4TB SSD: 16.03 seconds
MacBook Air (M1) – 16GB/512GB SSD: 11.69 seconds
XcodeBenchmark
MacBook Pro i9 2.4GHz/64GB/4TB SSD: 223.016 seconds
MacBook Air (M1) – 16GB/512GB SSD: 127.713 seconds
Again, these are real world benchmarks using real projects for Xcode and Final Cut Pro. And the MacBook Air doesn’t even have a fan. During the Xcode benchmark, the MacBook Pro’s fans spun up and were quite loud; the MacBook Air was dead silent. Of course, during sustained CPU usage the Pro’s fan will be a benefit, keeping the CPU cooler, whereas the fanless MacBook Air M1 will throttle the CPU down to prevent the computer from overheating.
Here’s another comparison: I have a Logic Pro X project I’m working on that contains about twenty tracks – 16 audio tracks and 4 software instruments, with various effects applied across all of them. This project couldn’t play without stuttering on my 2020 i5 MacBook Air (16GB/512GB). On the M1 MacBook Air, it plays as smooth as butter.
App performance isn’t the whole story though. The entire OS feels much faster. Apps (those that are Apple Silicon enabled) open incredibly fast. I’ve only tried a few apps that weren’t optimized for Apple Silicon, and the results have been great. For example, the original Pixelmator, which I still use for day to day graphics tasks, isn’t optimized for Apple Silicon. It opens quickly, and using it feels just as fast and performant as it does on my i9 MacBook Pro. Every app is different, of course, but Rosetta 2 looks like another incredible feat of engineering from Apple.
On the software development front outside of Xcode, things are a different story. Not much of my everyday work toolchain is ready for Apple Silicon yet. Homebrew, NodeJS, Docker and other web technologies are not quite ready for Apple’s new chip, and anyone who works with these technologies would be well advised to wait before upgrading to the new machines.
But for everyone else – come on in, the water is fine.
The short of this is: If you are using your Mac for Xcode, Final Cut Pro or Logic Pro X, you are going to be blown away at what these Macs with the M1 chip are capable of.
Amazon is currently dropping $50 off the price of the M1 powered MacBook Air and Pro.
Some quick takeaways from the Apple event today that heralded the release of the first Apple Silicon Macs.
Apple is still including 720p cameras in the Air and the 13″ Pro. In the age of COVID, when everyone is doing virtual meetings, this is disappointing. I’m not entirely faulting Apple here, as a 1080p camera thin enough to fit in a laptop lid doesn’t seem to exist yet – at least, I haven’t seen one in any laptop display. Apple is claiming that the M1 chip can improve the quality of the picture in sharpness and shadows.
Apple is using the M1 chip in the Air, 13″ Pro, and the mini. The low end Air uses a 7-core GPU vs. an 8-core GPU in the higher end Air. But apart from that, there’s no differentiation given between the models. This falls in line with Apple trying to keep the specs of its hardware as vague as possible. But it makes you wonder whether the performance of the higher end Air and the 13″ Pro will be comparable. The Air doesn’t have a fan, so it will be thermally constrained compared with the Pro, but for workloads that are not sustained, it should be just as powerful. We will see once the real world benchmarks start appearing.
All three of these machines max out at 16GB of RAM. The RAM is integrated into the M1 package itself, which should mean RAM throughput will be quite fast. But it also means no user upgradeable RAM (which previously allowed people buying a Mac mini to save a good bit of money by handling the RAM upgrade themselves). For the Air, topping out at 16GB is fine. For the 13″ Pro, it’s acceptable – this is the low end 13″ Pro, after all, the one with only 2 Thunderbolt ports… the real 13″ Pro (with 4 ports and better RAM options) should come later. For the Mac mini, it’s a letdown. You can configure the last Intel Mac mini with up to 64GB RAM. 16GB just isn’t enough to handle things like opening 100GB Photoshop files, or opening Final Cut Pro projects that are several gigabytes in size. Now, maybe Apple has optimized the architecture of these new Macs to page out to the SSD more efficiently, but for real pros, there is no substitute for ample RAM. I’m anxiously waiting to see how well these RAM constrained Macs perform with large files. We will see.
None of these Macs can utilize an external GPU. That’s a bummer.
The Mac mini is once again available in Silver, and not Space Gray like the last version. I suspect this might be because Apple is going to have a ‘Mac mini Pro’ available at some point in the future, which would (theoretically) have higher RAM options.
With Apple utilizing RAM on the chip, they have buyers over a barrel. Apple can charge whatever they want for a RAM upgrade (and at present they are charging a $200 differential between 8 and 16GB). This isn’t going to sit well with many users (myself included). This makes me very nervous for an Apple Silicon Mac Pro or iMac. Those machines are geared towards people who usually demand gobs of RAM, and don’t want to pay Apple exorbitant fees. If those machines don’t have user upgradeable RAM, there are going to be a lot of disgruntled Pro users.
All three M1 Macs can drive Apple’s 6K display.
The M1 Mac mini features two Thunderbolt 3/USB 4 ports and two USB-A ports. This is a downgrade from the last Intel Mac mini’s four Thunderbolt ports and two USB-A ports. No Thunderbolt 4.
All the new Macs feature Wi-Fi 6.
I’m surprised the Touch Bar survives on the 13″ Pro. I was convinced Apple would drop it, but it seems it does serve a purpose in differentiating the Air from the 13″ Pro. Even while powering the Touch Bar, the 13″ Pro gets significantly better battery life than the Air. A slightly chunkier chassis goes a long way towards more battery capacity.
Apple Silicon looks like it’s going to be extremely competitive with Intel chips on the low end. Hopefully they can outshine Intel on the high end as well.
WWDC has come and gone, and the rumors were largely true: Apple will begin a migration to its own CPUs (which Apple is currently just calling ‘Apple Silicon’) sometime later this year. This move had been rumored for a few years now, so it came as a surprise to absolutely no one. Apple has long strived to control the entire widget, and with this move, it will remove Intel from the product matrix, giving Apple near total control over its Macs’ technological composition.
Now, Apple didn’t say anything about future Macs, apart from telling us they will use Apple Silicon. But if you watched enough technical presentations from WWDC, and paid attention to some of the details, there are some pretty obvious tells.
One of the biggest tells was Apple declaring in their keynote video that future Apple Silicon based Macs will be able to run iPhone and iPad apps directly. These apps will be available in the Mac App Store automatically, unless developers check a box to keep them from appearing there. Now, many iPad apps have been gaining mouse/trackpad input support, so these apps will probably perform as well on an Apple Silicon Mac as they do on an iPad. But what about iPhone apps? These apps generally do not have input device support beyond touch. How will they function on a Mac?
Apple has been saying for a decade that adding touch input to a Mac was a bad idea. But during that same time, Apple has brought mouse/trackpad/keyboard/pen input to the iPad, a device it long said was best controlled with touch. So all the while Apple has been claiming that a Mac with a touchscreen would be a terrible compromise, it was bringing the Mac-ness of keyboard/mouse/pen input to the iPad. Clearly, Apple now thinks an iPad with input support beyond touch is a good idea.
If you’ve used the macOS Big Sur beta, you’ve no doubt noticed some of the big changes to the user interface/experience. Much of the UI looks more like iPadOS now. Apple is pushing for the Mac’s icons to use the same ‘squircle’ shape that iPadOS/iOS use. It’s made the menu bar’s top-level items more spaced out, as if to allow for a larger touch target, and applied the same spacing to the menu bar icons. The save/don’t save/cancel modal dialogs now feature larger buttons, as if to accommodate fingers instead of mouse cursors.
So basically, Apple has made numerous UI enhancements that all seem to drive towards one goal – better input with something as imprecise as a finger.
Now, maybe we won’t get touch support when the first Macs with Apple Silicon ship later this year. Nobody knows which Macs will go first, but if the first one has a built-in screen, there’s a strong likelihood that it will feature a touchscreen.
There’s one other observation I’d like to make. Just a few years ago, Craig Federighi said during a keynote that Apple was not merging iOS and macOS. Well, they may both remain distinct OSes particular to their own hardware, but Apple has definitely gone just about as far as merging them as you can go without actually merging them.
iPadOS 14 and macOS Big Sur share a common design language.
Using SwiftUI, you can now develop for both platforms from a single codebase.
You can easily bring your UIKit iPad app over to the Mac, thanks to the work of project Catalyst.
And if neither of those scenarios works for your iPad app, you can just run it as-is on your Apple Silicon based Mac.
Now, there are still some major distinctions between the two platforms. For instance, the iPad still lacks a window manager. It is reliant on the App Store for installing new apps. The iPad’s security model is more restricted than the Mac’s. It doesn’t have the legacy hardware support for devices that the Mac does. There is still a feature chasm between the two platforms, but that chasm has grown considerably smaller. It’s small enough now that comparing the high end of the iPad line (the iPad Pro) with the low end of the Mac line (the MacBook Air) can make choosing one of these devices for a particular task difficult. Once the MacBook Air/Pro can run all of the same software an iPad can, along with software that isn’t restricted to the App Store, the chasm will be smaller still.
That sure does sound like the two products have merged to me.
Oh, and one final thought. The Touch Bar is a goner. There was no mention of any improvements to the Touch Bar API during WWDC. With touchscreen capability coming to future screen-enabled Macs, the Touch Bar becomes relegated to the long list of Apple technologies that never quite made sense.
Now, if we could just get Apple to add the Apple TV remote to that list.
More and more over the years, I find myself using Linux for day to day development. And why not? Most of the tools I use are available for Linux, and Linux is free and totally customizable. But can a developer who has long used macOS for his day to day tasks cut it on a Linux system? Let’s find out.
First, let’s cover what I use in my day to day work. Being a web developer, the tools I use daily consist of:
Google Meet and G Suite for the company’s daily meetings and document and file sharing. We make extensive use of Google’s web apps – primarily Docs and Sheets.
My employer’s online tools are the Atlassian suite of products – Jira, Confluence and Stash. Being web based, using these on Linux isn’t an issue.
An assortment of command line utilities, including Docker, PHP, Ruby, NodeJS, Python, MySQL and MongoDB.
Chrome/Chromium, Firefox for browsing.
So every bit of software I use daily is available for Linux, and generally works as well as it does on a Mac. Visual Studio Code and Slack are both Electron apps, and while they work well on Linux, there are a couple of caveats you will want to be mindful of, which I’ll get to later in the article.
I first gave this workflow a test run on an older i7-6700K based desktop machine I’ve had for a while. It was plenty powerful for what I run, and I felt confident that I didn’t need a machine with a dedicated GPU, and could instead get by with something small and power efficient, so long as it had sufficient RAM (32GB) and decent CPU performance. The Intel 10th generation i5 NUC fit that bill. I opted for 32GB of RAM (2 x 16GB SO-DIMMs). I could have gone whole hog with 64GB, but I think for my needs, 32GB is plenty of headroom. I went with a Crucial P1 1TB SSD. It’s not the fastest PCIe NVMe SSD in this class, but it was much cheaper than the Samsung EVO, and offers plenty of performance for a coding workstation.
I already had a couple of LG 4K Displays I was intending to use for this setup, but, as I’ll explain later, I had to use a different solution with this system.
Setup of the hardware is simple. You remove four screws on the bottom of the NUC and pull off the bottom plate. There are 2 DIMM slots in which to install the RAM, and an NVMe slot for the SSD. I opted for the tall version of the NUC chassis, so I could also install a 2.5″ SSD later on if I so desired.
After installation, I powered up and installed Ubuntu Linux 20.04. Ubuntu has long been my distro of choice, and the 20.04 release is one of the best releases of this OS ever. Upon booting up, I checked that the Wi-Fi and graphics worked OK, and then opted to do a minimal install. This is one of the things I’ve come to appreciate about Ubuntu in the last few releases. I don’t need or want all the bloatware of an office suite and a dozen other tools I’ll never use. The minimal install option allows me to get the basic system and a browser running quickly, at which point I can install only the things I’ll need.
This setup was not without its issues. First was my monitor. I have an LG 27″ 4K display I intended to use with this machine. Running at 100% or 200% scaling works fine. Ubuntu/GNOME has had fractional scaling for a few releases, but it’s far from perfect. Unfortunately, the area where it suffers most is with Electron apps – which both Visual Studio Code and Slack are. Both will randomly switch back to 100% scaling, which makes their interfaces really small. This can be fixed by restarting the app, but it happens frequently enough to be annoying.
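One stopgap worth trying, assuming your builds of VS Code and Slack accept standard Chromium switches: pin the device scale factor at launch instead of relying on GNOME’s fractional scaling. The 1.5 below corresponds to 150% scaling and is just an example value:

```shell
# Launch each Electron app with a fixed Chromium scale factor, so the UI
# can't randomly snap back to 100%. Adjust 1.5 to match your display.
for app in code slack; do
  if command -v "$app" >/dev/null; then
    "$app" --force-device-scale-factor=1.5 &
  else
    echo "$app not installed; skipping"
  fi
done
```

It’s a workaround rather than a fix, but it sidesteps the scaling resets entirely for those two apps.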
The other issue with the 4K display happened when trying to play back videos on YouTube in 4K. They weren’t super choppy, but they weren’t super smooth either. The Intel UHD integrated graphics in the Intel NUC10I5FNH1 seem to struggle in Linux beyond a 1080p display resolution. In addition to the video issues, the UI in GNOME wasn’t smooth either. Ditching the 4K panel and instead using an older Dell 24″ 1080p display returned video and UI responsiveness to acceptable levels.
So now things are set up and running smoothly. I’ve passed my first day of using this system for work, and I’m pretty happy with it. I’m not surprised that both VS Code and Slack run so well on Linux. I had tried both of these apps on Linux prior to going down this route, since they are the core of my toolset. And of course, all the command line tools I use work just as well on Linux as they do on macOS, even if some (Docker) are a bit more involved to install and set up.
The only area that’s currently giving me a bit of grief is the VPN. My company uses Pulse Secure to connect to our VPN, and while there is a Linux Pulse VPN client, it relies on several package dependencies that are abandoned and have not been updated for Ubuntu 20.04. I’m currently using OpenConnect in its place, and while it does work, it hasn’t been nearly as stable or consistent as the Pulse client was on my Mac. I’ll post an update once I get this bit ironed out.
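For anyone in the same boat, my OpenConnect invocation looks roughly like this. The host name and user are placeholders, and the --protocol=pulse flag requires OpenConnect 8.0 or later:

```shell
# Connect to a Pulse Secure VPN with OpenConnect in place of the official
# client. Substitute your own gateway and user name for the placeholders.
if command -v openconnect >/dev/null; then
  sudo openconnect --protocol=pulse --user=jdoe vpn.example.com
else
  echo "openconnect not installed"
fi
```

Ubuntu 20.04 packages OpenConnect 8.x, so a plain `apt install openconnect` should get you a version with Pulse protocol support.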
So far, the biggest inconvenience I’m encountering is the lack of 1Password for Linux. I use it on all my iOS devices and Macs, with everything synced through iCloud. Neither is an option on Linux, unfortunately. I’ve used Enpass before, and it’s an OK cross platform solution, but since I’ve ditched Dropbox for iCloud, there’s no real way to keep the Linux machine and the iOS devices in sync.
Are you using Linux for your day to day work? What challenges are you experiencing in your workflow? Have you felt the pros outweigh the cons? Let me know down below in the comments.
Since 2011, I’ve worked either primarily (50% or more) or entirely (100%) from home. Since 2017 I’ve been working from home 100%, and honestly, the thought of ever going back to work in an office scares the crap out of me. Don’t get me wrong, I do miss the in-person, human-to-human interaction, but living in Atlanta, I dread the loss of 10-15 hours of my life a week spent in a car sitting in traffic. I’m much more productive working from home, and having done so for the better part of the last decade, I feel pretty confident dispensing some tips on how to do so.
The most important ingredient in successfully working at home is a workspace that is free from distractions. If you have a home office, you’re off to a great start. If you don’t have a dedicated home office, try to find a room in your house where you can set up a temporary one. The key ingredient is a door you can close, because you will need quiet and to keep distractions from housemates to a minimum.
If you are using a room like your bedroom as your office, you will probably find that the monotony of spending 8 hours sleeping in a room and another 8 hours working in that same room wears on you. In the early 2000s, when I worked from home periodically, I had my work desk in my bedroom, and it got to be too much. It’s fine as a short term solution, but over the long haul the mental toll can be significant.
The other key to surviving in your workspace is to make sure you get out of it periodically. Take time to leave your workspace for lunch. Try to get outside the house, even if it’s only for 15 minutes or so, a few times during the day. When you are physically at work, you get up during the course of the day for mental breaks. Just because you are at home doesn’t mean you don’t still need those breaks.
Proper tools for the job
Having the right tools at your disposal is equally as important as the space you will inhabit during this time. Here’s a few recommendations.
If you don’t already have a desk, I’d suggest getting one that can easily adjust to being a standing desk. The one I’d recommend is the Autonomous Smart Desk 2. Available in a wide range of desktop materials and leg colors, you can get a 53″ x 29″ top with fully automatic legs for $429 (a $30 discount is available if you sign up for their mailing list). Also available on Amazon.
If you are looking for a budget option, you can piece together a table top and legs from Ikea for under $200. Ikea has a broad selection of table tops in different finishes, and a wide selection of legs to match.
I’ve been sitting on my keister professionally for 30 years, and I’ve had employers provide great chairs and not so great chairs. I can say with 100% certainty that the most essential component of working from home (if you choose to sit over standing) is a good chair. I’ve found no other chair that my ass and back appreciate as much as the Herman Miller Aeron Chair. Everything else is settling.
Working from a laptop is great for the freedom to pick up and move from room to room, but if you are staying in the same place for extended periods, a larger external screen is a great productivity enhancer. Because monitors are not a one-size-fits-all proposition, here’s a few options.
Best 1080p Option
If you aren’t looking for anything fancy, the Dell P Series 21.5″ LED-Lit Monitor (P2219H) is a great 1080p option. It’s a bit pricey (just under $200) for a 1080p display, but it’s an IPS panel with a great refresh rate and viewing angles. It can also rotate 90 degrees into portrait orientation, so if you are a coder using it as an external display for your code, you’ll get more of it in view.
Budget 4K Option
If 1080p isn’t going to cut it for you, and you want something larger, 27″ is the sweet spot for 4K displays. Of all the 27″ LED displays available, I’ve found that LG makes the best offering dollar for dollar. The LG 27UL650-W 27 Inch 4K UHD LED is a great mid range offering. 99% sRGB color gamut coverage makes it a great choice for designers. The rotating stand makes it a great choice for coders who want to use it in portrait orientation. AMD FreeSync makes it a great choice for gamers with Radeon graphics.
Best Overall 4K Option
If you can afford the splurge, the LG 27″ UltraFine 5K is still the best overall 27″ display available (at about $1300), especially if you are a Mac user. You get a 5120 x 2880 resolution display in an unassuming black chassis. The real benefits come for those connecting laptops over USB-C: the display offers USB-C connectivity and Thunderbolt 3 charging (94W). Additionally, you get a built-in webcam, which is nice when working from home these days. On the downside, you don’t get the ability to rotate the display into portrait orientation, and there’s no VESA mount capability. This monitor is also a bit long in the tooth, and hasn’t been updated in years. Still, if you are a professional working from home, this is the best 27″ 4K+ display available right now.
If your desktop doesn’t have a webcam, or if your laptop does but features one of the weak-as-hell 720p webcams Apple still insists on putting in their ‘Pro’ laptops, adding an external webcam can be a good option to make sure your co-workers are seeing you at your absolute best.
Logitech Pro Webcam C920
A 1080p webcam that includes stereo microphones and has wide-ranging support on Windows, macOS and Linux. It generally retails for about $80, but due to the Covid-19 pandemic, you may find it hard to get your hands on one at its normal retail price.
I see that the 2019 fold craze isn’t dead yet. This doesn’t look any more promising (or sturdy) than the Galaxy Fold.
I do think that as a concept, a foldable device could take off in the industry. However, as Samsung has shown, execution matters. None of the players in the foldable arena have gotten the recipe right yet. I’m not saying it’s not possible. I’m just saying that one year in, nobody shipping (or close to shipping) a foldable product has produced anything better than a lab quality prototype.
Will Apple jump into this segment? Possibly. But if they do, I hope they hold their fire until they have a foldable screen that is as reliable and sturdy as a non-foldable one. If it isn’t, then what’s the point?