A couple of years later and I'm a convert. If I had to buy a new MBP today I would definitely make sure it has a touchbar. It turns out app makers have found some nifty uses for it, like quickly muting/unmuting in conference calls, changing volume/brightness, stepping through code while debugging, etc. These are seemingly insignificant quality-of-life improvements, but I definitely miss them when on a machine without a touchbar. Conversely I haven't missed the Fn-keys, not even once.
That hardware esc key looks tasty though, I hope they bring that to the 13" model.
I think what the touchbar brings is an always-active tutorial/reminder as to what's possible with the touchbar. Due to its very nature you don't have to be taught what's on there; you can see it, so you don't have to remember or rely on building up muscle memory.
I imagine it sucks for accessibility. I'd still want one before my faculties falter; I imagine I'd get some use out of it with my IDE.
It does also provide some useful feedback as it can display things, too.
I buy what people say that it's way better for scrubbing video/audio if you do that a lot (lots of people previously bought special-purpose accessories for the same)... and mostly worse otherwise.
I was part of the spell correction team at Google, and we made sure that we weren't overly aggressive even though lots of people mistype their web queries.
Nowadays I find it much harder to research rare things on Google, and I have to undo the automatic correction that the spell corrector does all the time (which is OK as long as it's easy to undo).
I'm more ok with the extra step, if it's my fault for misspelling.
Less so if I have to take an extra step to correct Google's inaccurate "correction."
I know at this point it's mostly marketing, but we are talking about a "pro" model.
In any case, it is very non-obvious that removing the function keys used by power users and non-power users alike (often for work) is necessary for the survival of the company.
It is also very non-obvious why the function keys had to be removed for a touchbar to be added to an already relatively expensive piece of hardware. The touch-bar/function keys are already non-reachable from the home row for most people; so what's another row? I think this was a design-oriented decision, not a regular-user-centric one.
Making scrubbers for audio/video (more) physical is nice, since scrubbing video with a mouse usually requires that you first move the mouse to wherever the video player puts its scrubber, and scrubbing with the keyboard usually requires multi-key combos and always has the “wrong” granularity. (It’s also helpful in that you can now combine this scrubbing in a gesture with mouse movement, e.g. picking up a clip from your library in iMovie, scrubbing through the timeline to scroll it to the right position, and then moving the mouse over to the timeline and dropping it. That’s basically an impossible gesture with mouse-movements alone; you’re left to hover the clip over the scroll-edge of the timeline and wait for it to accelerate its scrolling [and then usually overshoot].)
Come to think of it, Sublime Text and other IDEs with a minimap could display it (rotated to horizontal) on the touch-bar, and let you scrub on it, too. Do they?
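I don't know of an editor that actually ships this, but as a rough sketch of how it could be wired up with AppKit's NSTouchBar API (the item identifier and the scrollDocument(toFraction:) helper are made up for illustration, and a plain slider stands in for a rendered minimap):

    import AppKit

    // Hypothetical identifier for an editor "minimap scrubber" Touch Bar item.
    private let minimapItemID = NSTouchBarItem.Identifier("com.example.editor.minimapScrubber")

    class EditorViewController: NSViewController, NSTouchBarDelegate {

        // AppKit asks the current first responder for its Touch Bar.
        override func makeTouchBar() -> NSTouchBar? {
            let bar = NSTouchBar()
            bar.delegate = self
            bar.defaultItemIdentifiers = [minimapItemID]
            return bar
        }

        // Supply the custom item: a 0...1 slider mapped onto the document.
        func touchBar(_ touchBar: NSTouchBar,
                      makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
            guard identifier == minimapItemID else { return nil }
            let item = NSSliderTouchBarItem(identifier: identifier)
            item.slider.minValue = 0
            item.slider.maxValue = 1
            item.target = self
            item.action = #selector(scrubberMoved(_:))
            return item
        }

        // Scroll the document to wherever the finger is on the bar.
        @objc private func scrubberMoved(_ sender: NSSliderTouchBarItem) {
            scrollDocument(toFraction: sender.slider.doubleValue)
        }

        private func scrollDocument(toFraction fraction: Double) {
            // editor-specific scrolling, omitted
        }
    }

Drawing an actual minimap image would presumably need a custom view inside an NSCustomTouchBarItem instead of the slider, but the wiring would be much the same.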
The basic VLC two-finger scroll scrubbing works way better.
And as the TouchBar is a per-app custom display, you'd have to look down at your keyboard to see the current status... of anything.
Someone in a related thread said the app-specific keys are only available when the app has focus, so wouldn't you have the status of the app in front of you anyway?
In any case, I found the statement funny, because touch is flakey for me, so I never know what impact pressing a touch key will have.
Essentially, a partial fusion of the two things would be physical keys with screens on top for dynamic relabeling, but that would have its own problems.
So for me, the Touch Bar makes these contextual commands discoverable, and that’s worth more than the tactility of physical keys. Doesn’t matter if I can touch without looking, if I don’t know what the keys do. Besides, after a couple of years with the Touch Bar I feel like muscle memory seems to work about the same anyway, I never have to look for the mute button in Zoom for instance, I know where it is.
Thanks for challenging my comment, prompting me to (hopefully) clarify!
I do have touchbar envy! I tend not to look at my keyboard too much on desktop, but when on a laptop it's definitely in my peripheral vision and would hopefully encourage me to learn its shortcuts. There's a lot of UX work involved to make it all work perfectly though. Apps, especially, shouldn't just use the touchbar; they should make it easier with visual in-app reminders about what's down there. They should do this with the standard Fn-keys too for us non-touchbar folk. I'm a big fan of having keyboard shortcuts shown in on-screen UIs.
There's no denying it can do more than Fn-keys.
If the external keyboard had a TouchBar and I started incorporating it into my workflow, I could see its usefulness.
Which is honestly interesting; it seems almost like they’re suggesting that this could grow into a larger feature, where Sidecar is really a kind of “Remote Touch Bar.app”, and you could add larger Touch Bar controls to your app that are only visible through Sidecar. (So you could have OS-level support for e.g. DAWs to display their VSTs onto your iPad for direct manipulation, without needing their own iPad OS app.)
———
I should note, as an aside, that Sidecar lets you type on the Mac host through an iPad’s attached keyboard-case, but doesn’t really treat finger-gestures done on the iPad screen in the Sidecar app as equivalent to mouse or touchpad gestures on the host.
I’m wondering if that’s a conscious design choice, and whether someone at Apple is thinking that the “new HCI paradigm” for desktops will involve still having an external Bluetooth trackpad, but no external Bluetooth keyboard, with that role instead being served by an iPad with a keyboard-case attached to it.
That’d kind of fit—it enables all five(!) interaction methodologies Apple currently has: mouse gestures, keyboard commands, touch inputs, pencil inputs, and touch-bar controls.
But, importantly, it does so while entirely avoiding “gorilla arm” (unlike the huge Microsoft Surface Studio), because your touch surface is small and on the table in front of you, rather than “being” the display. (For most Sidecar iPad gestures, even with a full-sized iPad Pro, you never really have to lift your arm off the table.)
In Apple’s envisioned desktop paradigm, touch is seemingly an input method that you get from a separate input device—one that happens to have a screen—rather than touch being just a “way to do” mousing.
I usually put my laptop in front of me with an external screen above it. That way I get used to the laptop keyboard and can feel right at home in a conference room or wherever.
Maybe I've just had a limited, poor experience with it, but from the apps I've used these functions are only available on the touchbar when the app is focused... which has the controls anyway.
I was using Skype for Business and it had the audio controls in the touch bar, and also on the screen about 1.5 inches above the touchbar. I wanted to mute and unmute when I had the app in the background, but I couldn't. Useless.
Most apps don't use it very well though - I actually quite dislike Safari's overcomplicated tab buttons - I can't make out anything so it's basically useless, and I use cmd+left arrow to go back, not the touchbar button.
I would love to have a hardware escape key. But despite the shitty butterfly keyboard issues I'm currently going through I want to keep this Macbook as long as possible.
Well, these have physical buttons pre-touchbar, so the touchbar is definitely slower since it requires two clicks (one to summon the slider) and you have to look at it instead of just feeling it.
Learn keyboard shortcuts, and no one needs a touch bar.
The touch bar works best for things accessed sufficiently infrequently that muscle memory won't be a thing.
I think the touchbar is a cool idea and I loved that Apple explored it and innovated. But I also think that ultimately it's not a good idea or good design, particularly for power users.
Then again, I hardly ever use the touchbar either.
I tried to give this a good go. I remembered when I was the young kid thinking about all those old fogies who could not adapt to changing technologies, and now, I am the one yelling "get off the lawn!"
I can find out, sure, but my experience tells me I won’t, so for me the Touch bar is a win, simply from a discoverability standpoint.
But in all fairness, the F-keys on a mac never really worked like function keys, they've always been more of a row of OSX-keys. I can understand why Apple felt a need to innovate that space, but I am not impressed.
It's a solution looking for a problem. It's garbage.
i never type with capslock on, and find using it for escape helps my vim workflow
(Edit: In case anyone else is wondering the same things, the main answers seem to be (1) on manual typewriters, both the shift and capslock needed to physically lift the entire carriage, and so required more force and was for some users a two-finger operation; and (2) all the buttons on the sides of the letters simply fill space to the edges, if we shorten the capslock, we'd need to put something else there or we'd move the A.)
It is not working well: the very limited travel on the keys is not giving my fingers the feedback that I actually hit the "esc" key (capslock). It introduces errors.
I'd probably have to live with this laptop for a while though. I don't think the startup I work for is going to buy a new MBP for me anytime soon.
Being able to go to 32 GB will be nice though. Up until this 2019 MBP, I had seriously looked into switching back to a Linux laptop.
Now that I am working with Elixir, multi-core is much more appreciated :-D