Precise imperfection in the Yahoo Weather app

Screen image: Yahoo Weather app showing precise numbers

The Yahoo Weather app has been widely, and deservedly, praised as an example of a beautifully conceived and executed native app. I agree with those assessments, and use it on my iPhone most days to try to understand how many facets of Canada’s weather I might expect to experience in the coming hours and days.

There’s one detail, though, that stands out as gratingly wrong for me. It’s such a small detail that to point it out seems petty, but in an app of otherwise such high quality it’s hard to ignore.

While I can configure the app to use metric measurements for temperatures and wind speeds, it uses a mix of imperial and metric in the text summary of the forecast. The imperial measurements are shown first, with a precisely calculated metric equivalent shown in parentheses. In the example shown here, that makes for an awkwardly presented result of “High around 35°F (1.7°C)”. Wind speed presentation is similarly awkward.

The word “around” shouldn’t be followed by such a precise measurement. Beyond that, the text summary should just show me the metric measurements if that’s what I’ve configured.
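
To make the rounding point concrete, here’s a minimal Swift sketch of what a friendlier conversion might look like. The function name is mine and hypothetical; I have no idea how the Yahoo Weather app actually does its conversion.

    import Foundation

    // A hypothetical sketch: round a converted temperature to match the
    // vagueness of "around". Not anything from the Yahoo Weather app.
    func approximateCelsius(fromFahrenheit fahrenheit: Double) -> Int {
        let celsius = (fahrenheit - 32.0) * 5.0 / 9.0
        return Int(celsius.rounded())  // whole degrees are precise enough for "around"
    }

    // "High around 35°F" becomes "around 2°C", not a jarring "1.7°C".
    print("High around \(approximateCelsius(fromFahrenheit: 35))°C")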

On an unrelated note, I’m looking forward to seeing some warmer temperatures showing up in the app…

Duelling “Cancel” buttons on my iPhone

I’ve recently noticed an example of duelling “Cancel” buttons in iOS 8 on my iPhone. Have a look at the following screen captures to see how it plays out.

Screen 1 shows the Photos app on my iPhone displaying a photo of a steam locomotive. If I want to make some changes to the photo, I can tap the “Edit” button in the upper right corner.

Screen image: train

(Screen 1)

In Screen 2, I’ve decided to crop the image, which is easy to do.

Photo-editing screen on an iPhone

(Screen 2)

Maybe this photo doesn’t need to be cropped after all. In Screen 3 I can tap “Cancel” in the bottom left corner.

Photo-editing screen on an iPhone

(Screen 3)

The app designers don’t want me to inadvertently lose any work that I’ve done. In Screen 4 the app presents a confirmation, with “Discard Changes” and “Cancel” as the options. Why yes! I do want to cancel!

Photo-editing screen on an iPhone

(Screen 4)

Of course, the meaning of “Cancel” has changed! Initially it meant “I don’t want to make these changes!” Now it means “I do want to make these changes!” That’s obviously a little confusing, as tapping “Cancel” leaves me on the very editing screen that I want to leave.

A better pair of labels might be “Cancel, and discard changes” and “Continue editing”. There are probably plenty of others.
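
As a sketch of what clearer wording might look like in practice, here’s a minimal example using the UIAlertController API that arrived in iOS 8. The message text and labels are just my suggestions, not Apple’s actual implementation.

    import UIKit

    // A minimal sketch, not Apple's actual code: a discard confirmation
    // whose labels say plainly what each choice does.
    func confirmDiscard(from viewController: UIViewController,
                        discard: @escaping () -> Void) {
        let alert = UIAlertController(
            title: nil,
            message: "You have unsaved changes to this photo.",
            preferredStyle: .actionSheet
        )
        // The destructive choice names its consequence outright.
        alert.addAction(UIAlertAction(title: "Discard Changes", style: .destructive) { _ in
            discard()
        })
        // "Continue Editing" avoids reusing the ambiguous word "Cancel".
        alert.addAction(UIAlertAction(title: "Continue Editing", style: .cancel, handler: nil))
        viewController.present(alert, animated: true, completion: nil)
    }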

You don’t often see this kind of unclearly worded confirmation these days!

Muscle memory rules parts of my world

I recently bought an iPhone 6 to replace my iPhone 4S.

(As an aside, my wife will now use that iPhone 4S, and we’ll finally retire my original iPhone 3G from daily use!)

I’ve enjoyed it so far, and the new iOS 8, but there are a couple of muscle-memory habits that I’m still working on overcoming with the new device.

First, Apple has moved the sleep button from the right side of the top edge to the upper right edge of the device. The move makes sense given the larger size of the iPhone 6, but after a few days of use I still automatically reach for the top edge to put the phone to sleep.

Second, the iPhone 6 includes Touch ID, which uses the home button to detect my fingerprint. Again, after a few days of use I still press the home button and then swipe to wake my iPhone, even though a button press plus a lingering finger on the home button will engage Touch ID and get me into the device.

At a more minor level, the larger screen size is taking some getting used to. It’s not uncomfortable, but my hands haven’t yet adjusted to the jump in size from the iPhone 4S.

It’s kind of fun to notice this stuff as it happens.

Scrolling in the Apple TV UI

I’ve written before (here, here, and here) on the differences between scrolling on an Apple Macintosh and scrolling on an iOS device like an iPhone, and how those differences are going away.

It turns out that there remains at least one remote corner of the Apple universe where the reconciliation of gestural meaning is still a little awkward.

The Apple TV is a content delivery device that provides an elegant user experience for delivering content from a variety of sources to a television screen. It includes a wonderfully simple remote control that is generally a delight to use. One less than delightful aspect of the remote control, though, is its cumbersome method for entering text, such as when searching for content. Happily, Apple provides an iOS app called Remote, which can be used to control the Apple TV and which enables easier text input using a keyboard.

Of course, the Remote app can control all aspects of the Apple TV, using a gestural UI that one would expect from iOS. It’s here that things get a little awkward.

Note the following models:

  • On a Mac (in OS X Lion), dragging two fingers on the trackpad moves the contents of a window (e.g., scrolling through a list).
  • On iOS, a swipe gesture moves the screen (e.g., scrolling through a list).
  • On Apple TV, clicking the arrows on the remote control moves the on-TV-screen selection indicator (e.g., selecting an item in a list).

Using a swipe gesture in the Apple TV Remote app, which in effect turns the iOS device into a trackpad, also repositions the on-TV-screen selection indicator. This is quite similar to the behaviour of the gesture on a Mac trackpad prior to Lion, where the gesture controlled a UI widget (the scrollbar) rather than the content itself: the gesture moves the selection indicator, not the screen content. That makes sense for a point-and-click remote control, but not for a gestural one.

For me the awkwardness arises when scrolling through a long list, such as many rows of movies.

That is, when scrolling vertically the selection indicator stops moving in the middle of the list viewport, and the list moves through the selection indicator. To me, the experience feels strongly like the swipe gesture is moving the list in the opposite direction to the swipe. As a result, I use the regular remote control for navigating screens on the Apple TV, one click at a time, and an iOS device for entering text when needed. This isn’t really optimal, and I’m curious to see how Apple evolves and improves the experience.

Travel game with Google Street View

During my recent Karos Health trip to Chicago for RSNA, I accidentally discovered a fun game to play using my iPhone and Google Street View. On this and previous trips I’ve emailed photos to my son to show him some of the sights that I see. Often the pictures are taken while traveling from one location to another.

On this Chicago trip, my son noted in an email that one of the pictures I sent was similar to one that I had sent him during last year’s trip. I responded that it was probably the exact same view, as I would have taken both pictures while riding a bus between our hotel and the conference site. The next day he sent me an email showing me that he had found a similar picture on Google Street View.

Buildings in downtown Chicago

(My Picture)

A Google view of Chicago

(Street View Picture)

And thus, a new (for us, anyway) game was born.

Every day for the remainder of the trip I sent him a new photo, and he found corresponding shots in Street View for all of them. At first, I included clues in my emails, but eventually stopped and just let him discover clues in the photos themselves. He didn’t have any trouble.

A view of Chicago

(My Picture)

A view of Chicago

(Street View Picture)

This game turned out to make a business trip more interesting for my family. We might even continue it here in Waterloo.

It turns out that Apple is not perfect

I’ve been having trouble with my Apple TV, and decided to pay a visit to the Genius Bar in the recently opened Apple Store here in Waterloo to see if I could get some help. I had been unable to address the problem on my own, or via a bit of Web research, and I wanted to experience the Genius Bar in action.

My three boys and I went to the store on a weekday afternoon, and I was surprised at how busy it was. The place was packed with customers and employees. I made an appointment to consult someone at the Genius Bar, and my sons explored some of the products on display. They had a ball, and nobody tried to interfere with their fun.

Come appointment time, I explained the problem I was having to Darcy, the friendly and knowledgeable genius who was helping me. After exploring some options and trying to reproduce the problem, Darcy eventually decided to replace my Apple TV (a fix that, as I discovered upon returning home, did solve the problem). The latitude given to Apple Store employees appears to be about as wide-ranging as I had previously heard. It was a great experience.

So what makes Apple less than perfect?

After making my appointment, I received an email from Apple confirming the time. The email included a link to an Apple Store iPhone app. I decided to try installing it, and clicked the link in the email to do so. The App Store application launched on my iPhone, but instead of seeing the Apple Store app, I saw a message saying “Your request could not be completed.” I asked Darcy about it, and he told me that the Apple Store iPhone app isn’t available in Canada, and that he’s not sure why the link is included in emails for the Canadian stores.

It’s an extremely minor issue, of course, and it didn’t at all bother me that I couldn’t download the Apple Store app. It is, though, telling that such a minor thing stands out in a customer experience that is otherwise exemplary all around. It’s just not quite perfect.

Delivering overnight at Karos Health

Last Friday we had our second FedEx Day at Karos Health. Our first FedEx Day was, to me, a prototype to see how the event would work at our company. That first try was a great success, and we resolved to do it again. While the details of our approach are slightly different from those at Atlassian, the company whose FedEx Days inspired us, our take on the concept is similar in spirit and purpose. (By the way, it’s called a FedEx Day because you have to deliver something overnight.)

For our second event I had two goals in mind for my own activity. The first was to learn more about Ruby on Rails, a programming language and framework combination that I had begun investigating earlier in the week and with which I had already built an exceedingly simple application for creating and managing patient records. My second goal was to explore an approach to publishing documents as defined by IHE. I picked one of the simpler scenarios (use cases, in IHE parlance) to try to deliver on:

A patient in the emergency department has all her relevant available documents retrieved via XDS transactions. As initial triage of the patient is done, an additional document regarding diagnostic results for this patient is registered in the XDS Document Registry. Currently, there is no way for the Emergency department to learn about the existence of this new information. With a publish/subscribe infrastructure, the initial query to the XDS Document Registry would be accompanied with a subscription request, as a result of which a notification would be sent to the emergency department. The subscription will be terminated once the patient is no longer under the care of the emergency department’s institution.

— from “Unexpected Notification Use Case”, section 26.4.1 of IHE IT Infrastructure Technical Framework Supplement: Document Metadata Subscription (DSUB) (PDF)

Put another way, an emergency department physician has requested an imaging study, such as an MRI, for a patient. The requesting physician needs to see the results, provided as a document, as soon as they are made available by the radiologist who read the study images. A notification alerts the physician that the result document is available.

Working with two of my Karos colleagues, I used Rails to put together a simple web app prototype focused on what a physician might see on a smartphone (an iPhone, for demo purposes, as that’s what I use every day) when receiving a notification that a document has been made available or updated. We used a simple script to push notifications into the prototype’s back end, which dutifully made them available to the different colleagues that we were demoing to. Each notification includes a link to the affected document, which can be viewed right away. Happily, the prototype worked well and I’m thoroughly enjoying Ruby on Rails so far.
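
As a rough illustration of the idea, here’s a hypothetical sketch in Swift of such a push script. The endpoint, field names, and values are all invented for illustration; they aren’t the prototype’s actual API or the DSUB wire format.

    import Foundation

    // A hypothetical re-creation of the kind of script we used to push a
    // notification into the prototype's back end. Everything here is
    // illustrative: endpoint, fields, and values are invented.
    let payload: [String: String] = [
        "patient": "Jane Doe",
        "title": "MRI Report",
        "status": "available",
        "documentURL": "https://example.com/documents/42"
    ]

    var request = URLRequest(url: URL(string: "https://example.com/notifications")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try! JSONSerialization.data(withJSONObject: payload)

    // Post the notification; the back end stores it and surfaces it, with a
    // link to the affected document, on the physician's phone.
    URLSession.shared.dataTask(with: request) { _, _, error in
        print(error?.localizedDescription ?? "notification posted")
    }.resume()

    RunLoop.main.run(until: Date().addingTimeInterval(5))  // let the async call finish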

It was fun to build and show the prototype, and fun to see all the other results that emerged from a day of directed play at Karos. We’re all looking forward to the next FedEx Day at Karos.

Adding albums to the music mix

LPs by John Cage and Bruce Springsteen

I spent some time over the holidays updating the music on my iPhone. That’s something that I do periodically, as it has far less capacity than would be required to hold my music collection and I like to vary what I listen to. The sources for the tracks are varied. Some I download from iTunes and other sources. I often digitize music that I have on CD. Less often, I digitize music that I have on vinyl albums or 45s, and doing so recently got me thinking about mix tapes.

I’ve created them, in the distant past, and enjoyed the reverence for the form in Nick Hornby’s High Fidelity (in both novel and film versions). Still, it had been many years since I made a mix tape, and even a few years since my last round of vinyl digitization. In the intervening time I had filtered from my memory just how tedious it can be to digitize more than a few tracks. I retained only a fuzzily idealized notion of savouring each track while it is transferred to digital form (or, as I did in days gone by, cassette tape). That notion holds for the first few tracks, but the novelty does wear off! Here’s a simplified version of the steps required to digitize a track:

  • Connect the turntable to the computer.
  • Pull the record from its sleeve and put it on the turntable.
  • Start the turntable and drop the needle on the track; set recording levels to be loud but not so loud that distortion is introduced during the loudest passages.
  • Once levels are set, drop the needle again, but before the piece starts.
  • Start the digitizing/recording.
  • Enjoy the track while it plays.
  • When the track has finished, stop the digitizing/recording.
  • Remove the record from the turntable and replace it in its sleeve.
  • Edit the digitized track to eliminate any silence at the start and end of the track.
  • Add the track to iTunes and add metadata to taste (track name, artist, sleeve art, etc.).
  • Repeat as necessary.

Obviously there are workflow optimizations available (e.g., record a batch of tracks, then edit them, then add metadata), but it’s still a laborious process. It was even more so in the past, when the target was a cassette tape and the process included selecting tracks to efficiently fill a fixed-length tape, manually minimizing silence between tracks, and creating cover artwork by hand.

Anyway, in the end I realized that I don’t at all miss the tedium of creating mix tapes the old-fashioned way, or digitizing analog formats. I do, though, love listening to the iPhone-age equivalent of mix tapes.

A user’s experience shapes user experience

I had a conversation with someone a while back about user experience on mobile devices. In the course of it he asked me how much I use video on my iPhone. I responded that I used video occasionally but not much, which was accurate in that it characterized my behaviour but which also felt incomplete.

I’ve been thinking about it since then and observing my patterns of use. I think that I now better understand why I answered the way I did. That is, I now understand the why of my behaviour. It’s not that I’m uninterested in watching video. It’s that I’ve shaped my behaviour to match my experience of the capabilities of the device. Two big factors colour my experience.

The first is battery life. My iPhone 3G will, in normal use, last the whole day on a single overnight charge of the battery. I found early on, though, that watching even a few videos will cause the iPhone to run out of power before day’s end. For a few reasons, normal use has changed for me over time. So, too, has my iPhone battery’s ability to hold a charge. As a result, many months ago I started to regularly use my mobile data plan for connectivity rather than WiFi, since turning off WiFi extends battery life.

More recently, while reflecting upon the video question from that conversation, I (re)discovered that I’m more likely to watch a video if the day is mostly done and I have a good charge left on the battery. This behaviour was almost unconscious; I had to notice myself doing it a few times before realizing why it was happening.

The second factor is network performance. The value of the video needs to overcome the cost of waiting for it to download, which can vary dramatically on my mobile provider’s network. As well, even if the video starts playing quickly, the download is rarely fast enough to allow me to skip ahead conveniently, which I often like to do. In the end, while using my iPhone if I encounter a video that seems interesting I most often reserve it for later viewing on my laptop.

I’ve discovered many other subtle behaviours that distinguish my own use of an iPhone from my use of a laptop on a fast network, but thinking about mobile video is what got the exploratory ball rolling for me.

Like a truck hauling a load of scrap iron

As I wrote back in May, performance is an important part of a good user experience. The perception of performance is something that a designer can manage by providing appropriate feedback, such as a progress indicator or message. Sometimes, though, the actual performance needs to be managed rather than just the user’s perception of it.
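
For illustration, here’s a minimal sketch in Swift of the “provide appropriate feedback” idea: put up a spinner right away, run the slow work off the main thread, and remove the spinner when done. The function and its parameters are hypothetical, and the APIs are modern ones rather than anything from the iOS 4 era.

    import UIKit

    // A minimal sketch of managing *perceived* performance: give feedback
    // immediately, do the slow work off the main thread, then clean up.
    // The function name and its "work" parameter are hypothetical.
    func performSlowTask(in viewController: UIViewController,
                         work: @escaping () -> Void) {
        let spinner = UIActivityIndicatorView(style: .large)
        spinner.center = CGPoint(x: viewController.view.bounds.midX,
                                 y: viewController.view.bounds.midY)
        viewController.view.addSubview(spinner)
        spinner.startAnimating()  // immediate feedback: something is happening

        DispatchQueue.global(qos: .userInitiated).async {
            work()  // the slow part runs off the main thread
            DispatchQueue.main.async {  // UI updates return to the main thread
                spinner.stopAnimating()
                spinner.removeFromSuperview()
            }
        }
    }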

I’ve been thinking about this over the last several weeks in the aftermath of updating my Apple iPhone 3G to iOS 4.

What had been a fine mobile device prior to the update became a sluggish annoyance after I installed iOS 4. In addition to noticeable overall slowness, my iPhone became prone to random screen freezes that lasted four or five seconds or more.

Even though Apple had disabled some of the features in the iOS 4 release when installed on an iPhone 3G model, there were still several remaining enhancements that I enjoyed. While appealing, though, none of them were enough, for me, to make up for the severe performance degradation that accompanied the release.

This past weekend I finally took the plunge and downgraded my iPhone to version 3.1.3 of the OS. As this is something that Apple appears not to want users to do, I used directions that I found in this article at Lifehacker, and they turned out to be clear enough for me to get the job done. The improvement in performance has been dramatic. Under iOS 4 my iPhone felt about as responsive as a flat-bed truck hauling a load of scrap iron. It’s now back to feeling like a small sports car, fleet and nimble and fun to use. I’m happy to sacrifice the iOS 4 features I no longer have for the snappy user experience that again defines my iPhone. And that says a lot about the importance of performance to user experience.