Getting in tune with the UX Group

The December meeting of the UX Group was a great event, despite the appalling weather. Much as with last month’s Ignite Waterloo event, the meeting showed that when people talk about product designs they are passionate about, the results are always illuminating and engaging.

I thought I’d briefly share the products that I brought to the table.

A guitar capo in use

First was a guitar capo. A capo is a device for holding down the strings on a fretted musical instrument, like a guitar, in order to raise the pitch. There are several styles and designs, ranging from a simple bar with an elastic strap to more complex inventions. I’ve owned several, with designs optimized for cost (the aforementioned elastic strap) and preservation of tuning (though at the usability cost of requiring very precise placement) amongst them. The capo that I showed is made by Kyser, and is optimized for fast, one-handed operation. The easy-to-grab handle makes quick changes a breeze, and it can be clamped to the headstock when not in use. Mine works quite well and I’m happy with the results.
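(A side note for the curious: the pitch change from a capo is simple equal-temperament arithmetic. Each fret raises a string by one semitone, which multiplies its frequency by the twelfth root of 2. A quick sketch in Python, using the standard-tuning frequency of the low E string rather than anything specific to the Kyser:)

```python
# Equal temperament: each fret raises pitch by one semitone,
# a frequency factor of 2 ** (1/12).
def capo_pitch(open_freq_hz, fret):
    """Frequency of a string clamped at the given fret (0 = open string)."""
    return open_freq_hz * 2 ** (fret / 12)

# A capo at the 2nd fret raises the low E string (about 82.4 Hz)
# to F# (about 92.5 Hz):
print(round(capo_pitch(82.41, 2), 1))  # 92.5
```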

A tuner attached to a mandolin headstock

Next up was a compact tuner. Musicians have long lived with the need to tune their instruments. While being able to do so by ear is a great skill to have, not everyone has the ear to do so reliably when first learning to play, and even those who have developed their ear may need to tune in a noisy environment. Electronic tuners have been around for decades now, and they’ve been a great aid for getting an instrument in tune. My first electronic tuner, which I acquired years ago for tuning my guitar and still have, is a large device with a great analogue needle that shows how far off a note is from being in tune. It’s clumsy to use with an acoustic instrument, though it is accurate. The newer tuner that I brought to the event, made by Intelli, is optimized for ease of use with fretted instruments. It clamps onto the headstock of the instrument and detects notes through vibrations transmitted via this direct contact. It swivels to make the display visible, the display is bright and easy to read, it works with both acoustic and electric instruments, it fits all my guitars and mandolins, and it is small enough to fit easily in an instrument’s case. It’s not perfectly accurate, but it’s great for my needs.
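(A side note on what “how far off” means: tuners typically report the deviation from the nearest note in cents, which are hundredths of a semitone. I have no idea how the Intelli computes this internally, but the equal-temperament math behind the needle is standard; here’s a minimal sketch in Python:)

```python
import math

A4_FREQ = 440.0  # concert-pitch reference for A4
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def nearest_note(freq_hz):
    """Return the nearest equal-tempered note and the deviation in cents."""
    # MIDI note 69 is A4; each semitone is one step in 12-tone equal temperament.
    midi = 69 + 12 * math.log2(freq_hz / A4_FREQ)
    nearest = round(midi)
    cents_off = 100 * (midi - nearest)  # 100 cents per semitone
    name = NOTE_NAMES[nearest % 12] + str(nearest // 12 - 1)
    return name, round(cents_off, 1)

# A string tuned slightly flat of A4:
print(nearest_note(437.0))  # ('A4', -11.8), about 12 cents flat
```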

It was fun to share these objects with the group, and I enjoyed the conversations.

Pretty lights on the tree, I’m watching them shine…

Children admiring lights in Waterloo Park

I’m really more of a warm-weather guy, but there are some fine outdoor events here in Waterloo Region that are worth venturing out into the winter cold for. One that my family enjoys is the Wonders of Winter, a festival of lights that graces Waterloo Park every year. This season’s version started November 28 and runs until January 3. On our visit on Sunday night, many people, young and old, were out enjoying the installation. My sons and I had a great time; my 4-year-old, in particular, was excited by his first visit to the festival.

It’s a community-supported initiative, powered by volunteers and backed by many sponsors. Thanks, folks!

Meanwhile, over at Kitchener’s Victoria Park, there’s Christmas Fantasy, a similar event that looks like it’s worth a visit too. I’m sure that we’ll get there soon.

(By the way, if the headline for this post seems familiar but you can’t quite place it, try to imagine Darlene Love singing it!)

Cloudy, with a chance of thoughts

Screen image: a ‘thought cloud’ about Jazz

We have something new at Primal Fusion this week. It’s another Primal Labs release, in this case an interaction prototype that enables you to build what we call a thought cloud to express your thinking on a topic.

We’ve released it into Primal Labs, rather than as a part of our main thought networking service, for a couple of reasons. First, we want to get feedback from our community of users on whether this is a useful way to express your thoughts. Second, it’s not finished, and there are many things that we could do with it. Rather than take it in a particular direction, we want that community feedback as quickly as possible to help us prioritize what we do.

We can certainly see using this in our main service, and we have other ideas about how to use it as well. For now, though, please try it out in the Labs area and let us know what you think.

My November Ignite Waterloo talk

Ignite Waterloo has released videos on Vimeo of 16 talks from the first event on November 25. It’s great to be able to watch the talks again, as it really was a wonderful night. I’m somewhat relieved to discover that my talk, entitled Metaphor in product design: Are you sure that’s an album?, turned out okay. It started life as a blog post here, but the video expands on the post a little and is more fun!

December UX Group event

The next UX Group meeting, on Tuesday December 15, aims for equal parts socializing and design discussion. It’s the second annual December Product Potluck. Don’t bring food, but do bring a product or products whose design you find worthy of discussion. Anything is fair game, as the goal is lively discussion about design. Last year’s version was lightly attended due to winter weather, but let’s not be deterred by that! Moreover, we had a great discussion during the November meeting and it would be wonderful to continue the conversation this month.

Note that for this month we’re holding the event at McMullan’s on King, a fine pub in Uptown Waterloo. I know from StartupDrinksWaterloo that the venue is well-suited to an event like this. We have a fine UX community here in Waterloo Region, and it’s worth the effort to get out and meet. I hope to see you there.

Sipping from the data hose

I’ve been wondering what it means to have access to vast amounts of data and information. In particular, I’ve been thinking about the implications, from a user experience perspective, when users assume that data is accurate and synchronized.

Google Maps recently added Street View coverage for Waterloo, Ontario, where I live. Like many other people, I spent some time exploring my city, and there were some interesting revelations. For example, I noticed that the pictures of my own house probably came from two different days, based on what’s visible in our yard. Moreover, I was able to narrow one of the days down to three likely dates last spring, based on the apparent weather and on the presence of a car belonging to my brother, who was visiting from out of town. Fun discoveries!

I also noticed that there’s a mismatch between the street view imagery and the aerial/satellite photo imagery. I’m sure that many other people have noticed this before in other cities, and that it’s not particularly exciting news, but sometimes an issue needs to hit close to home (figuratively and literally) to get my attention.

Screen image: Google Street View in Waterloo

Here’s a simple example. On King Street, there was some new development work done several years ago. The aerial/satellite imagery in Google Maps shows work in progress. Street View imagery shows completed buildings.

(See this example on Google Maps, though depending on when you access this link the imagery for the aerial view, the street view, or both may have been updated. The image shown on the right preserves the mismatch that I’m writing about.)

A mismatch like this is pretty easy to spot. It’s much bigger than the one I alluded to regarding my house, which really only I might notice. What does it mean, though, when a business in a street view image closes and is replaced by another? What does it mean when users add their own photos? How does the addition of historical imagery (in Google Earth, at this point) contribute to the mix? Does the fact that the Street View images are taken at different times matter at all?

In short, as more and more data is added to Google Maps, how do such data synchronization issues affect the user experience? I know that I find myself making implicit assumptions about the underlying data (for example, that it is relatively synchronized chronologically), in part because I find the experience so immersive.
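One idea that might help manage those assumptions: expose capture dates for each imagery layer and flag the ones that are far apart in time. As far as I know, Google doesn’t publish that metadata, so the sketch below is purely hypothetical, with made-up dates for the King Street example:

```python
from dataclasses import dataclass
from datetime import date
from itertools import combinations

@dataclass
class ImageryLayer:
    name: str          # e.g. "aerial", "street_view"
    captured_on: date  # when the imagery was taken

def find_mismatches(layers, max_gap_days=365):
    """Flag layer pairs whose capture dates differ by more than max_gap_days."""
    return [(a.name, b.name, abs((a.captured_on - b.captured_on).days))
            for a, b in combinations(layers, 2)
            if abs((a.captured_on - b.captured_on).days) > max_gap_days]

# Hypothetical dates for the King Street example: the aerial imagery
# predates the street-level imagery by roughly three years.
layers = [
    ImageryLayer("aerial", date(2006, 6, 1)),       # construction in progress
    ImageryLayer("street_view", date(2009, 5, 1)),  # buildings completed
]
print(find_mismatches(layers))  # [('aerial', 'street_view', 1065)]
```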

I’m sure that this isn’t an issue specific to Google Maps by any stretch. It’s just visible there, which got me thinking about what it might mean; I’m not yet sure what all the implications are!

VeloCity, Ignite, and StartupDrinks

A crowd of people enjoy an event at VeloCity

Yesterday was pretty busy for me outside the office. I started with a visit to the Student Life Centre at the University of Waterloo with my Primal Fusion colleague Tom Levesque, where we checked out VeloCity Start Up Day. There was quite a crowd perusing the project displays and talking with the students about their work. I was only able to speak with a couple of students myself, as many others were trying to do the same, but I was impressed by the turnout and by the interest shown by both participants and visitors.

Next up was a lunch meeting over sushi with a few of the folks who put together Ignite Waterloo. We talked about what went right (a great deal) and wrong (very little) with our first event last week. It looks like we’ll hold the next event in March of next year, and we’re already looking for speaker suggestions. We don’t anticipate changing much in what turned out to be a successful approach to the event. Having said that, though, we’re curious about what others think. Let us know.

Finally, I attended the third edition of StartupDrinksWaterloo. There had to be 30 people in attendance, and the conversations were a real treat. I met a few new people and saw some familiar faces, and I continue to marvel at the engaged and interested people who come out to events like this. I feel like a pretty committed attendee at this point, and I’m looking forward to the next edition in January. Congratulations and thanks to Dan Silivestru for initiating the Waterloo version of this event.

Instant websites! Just add water!

Screen image: Prepared Piano as understood by Primal Fusion

I recently wrote about a demo that a colleague and I gave at StartupCampWaterloo, showing a prototype that we had created at Primal Fusion. Yesterday, our founder Peter Sweeney showed this prototype at NextMedia in Toronto.

We’ve released that prototype into a new area at the Primal Fusion site, Primal Labs. It’s not a product yet, and it’s still pretty rough around the edges, but it does show some of the promise of thought networking.

Our Automatic Website Generator does exactly what the name suggests. It takes a user-supplied topic, builds related thoughts using our Primal Fusion platform, and then searches the web for related content. The results are presented, almost instantly, as a website on that topic. It works reasonably well, though the fact that we’ve put it in Primal Labs should make it clear that we have more work to do. Still, it’s a fine start, and we wanted to share it with the world and get feedback. Do try it out. You’ll need a Primal Fusion account at this point, but signing up is easy, and it shouldn’t be long before you can try both the Automatic Website Generator and our original product.
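For the technically curious, the pipeline is conceptually simple, even though the thought-networking step is where the real work happens. Here’s a toy sketch of the flow in Python; expand_topic and search_web are hypothetical stand-ins (hard-coded here) for the Primal Fusion platform and for web search, and neither reflects how our service actually works internally:

```python
def expand_topic(topic):
    # Stand-in for the thought-networking step; the real service builds
    # these related thoughts with the Primal Fusion platform.
    return [topic, f"history of {topic}",
            f"{topic} techniques", f"notable {topic} recordings"]

def search_web(query):
    # Stand-in for web search; a real version would call a search API.
    return [f"https://example.com/search?q={query.replace(' ', '+')}"]

def generate_site(topic):
    """Assemble a one-page site: one section of links per related thought."""
    sections = []
    for thought in expand_topic(topic):
        links = "".join(f'<li><a href="{url}">{url}</a></li>'
                        for url in search_web(thought))
        sections.append(f"<h2>{thought}</h2><ul>{links}</ul>")
    return f"<html><body><h1>{topic}</h1>{''.join(sections)}</body></html>"

print(generate_site("prepared piano"))
```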