bureau

It means “desk” or “work table” in French: a fitting metaphor for a blog discussing what my work is about. Get the RSS feed.


I'm Thibaut Sailly, an independent interface designer based in Paris. Say hello on Twitter or by email at bonjour ✉ tsailly ◦ net.


© 2010-2014 Thibaut Sailly · Powered by Movable Type · RSS

HoloLens

A few weeks ago I attended NUIDay 2016, an event at Microsoft Paris mixing demo sessions, panels and, most importantly, hands-on time. Alongside trying a HoloLens headset for a few minutes, I got to chat with developers who had been working on it in recent months. Here are some remarks from the notes I took.

The object itself

What you see

What you do

Content delivery

Potential of AR

Here's a quick list of situations where AR will help and where VR would be inferior:

To finish these notes, here are some topics I feel will need to be addressed before AR can reach a large public and scale to an economically viable set of products.

Apple Watch untouched

Here are some observations about the Apple Watch from what we can see to this day: a HIG document, a marketing site, a presentation event with a 12-minute demo, and the SDK.

Physical size

Although this device is inaccessible for now, and has a screen size we’re not used to, it is still essential to get a rough idea of its real size before committing to a specific design (if you want to build an app ready at launch time).
Based on Apple’s pictures and the watch cases’ heights (38mm and 42mm), I’ve approximated the dimensions of the two screens and drawn outlines of both objects at scale 1:1.
You can download the PDF* and print it out to see for yourself. You can also use it to build a fake watch to help with your information architecture work.

I’ve also resized some of the gorgeous watch pictures so that they’re at scale 1:1 when displayed full screen on an iPhone (5, 6 and 6+). They come in very handy if you use a certain mirroring tool to work on your designs, which I can’t recommend enough.
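The 1:1 resizing above is simple density arithmetic: a physical length in millimetres divided by 25.4 gives inches, which multiplied by the screen’s pixel density gives pixels. Here’s a minimal sketch of that computation, assuming the published pixel densities of the iPhone 5/6 (326 ppi) and 6 Plus (401 ppi); the 30 mm width is a made-up example, not an actual Watch screen dimension:

```python
# Density arithmetic behind a 1:1 on-screen preview.
# PPI values are the published densities of the iPhone 5/6 (326 ppi)
# and iPhone 6 Plus (401 ppi); 30 mm is just an example width.

MM_PER_INCH = 25.4

def pixels_for_mm(mm: float, ppi: int) -> int:
    """Device pixels spanning `mm` millimetres on a `ppi` screen."""
    return round(mm / MM_PER_INCH * ppi)

# Pixel width an image must have to appear 30 mm wide at true size:
for device, ppi in [("iPhone 5/6", 326), ("iPhone 6 Plus", 401)]:
    print(device, pixels_for_mm(30, ppi))
```

Scale the source picture so that the watch occupies exactly that many pixels, and it will appear at its real size on that device’s screen.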

You should also carefully watch the September presentation keynote, screenshot the hell out of it, take notes and learn from what Apple decided was appropriate in their own apps: the number of elements per screen, their respective sizes, the space left between them, the font sizes, the use of color and animations, etc.

While the screen is tiny and the interactions very limited, the graphic details play a major part in an app’s personality and quality. This work should not be overlooked. Have a look at the way the arrows are animated in the Activity app, for example.

Also worth keeping in mind: because of the way it’s held, the reading distance on the watch will be shorter than on a smartphone.

Ellipsis galore.

Ken, Stacey, Jane, Evans Hankey, Ben, Alex, Luke, Dana, Jon Dascola, Dan Keen, Lance Wilson, Jamie, Eliza, Chance Graham.
These people have two things in common: they all have very short names, and they’re all present on the watch marketing pages. Coincidence? Maybe not.

Nomenclature.

Glances are the watch’s notification center, and notifications are the push notifications we already know, only much smarter. Although these features are very similar in nature, using this new terminology isn’t all marketing: it helps thinking and communication when designing, avoiding mix-ups between what an app does on the phone and what it does on the watch. Use these terms early on.

Emotion and personality.

In the footnotes you’ll find very raw notes taken from the September event video. Personal was a very, very important word throughout the presentation, as was the focus on emotions. Not only in the new communication options the watch proposes (tap, sketch and heartbeat), but also in the UI itself: it might be a device where you want to be pleased, indulged even, by what you see, as much as you want it to be a great tool. Emotions have always been part of Apple products, but it seems this one is moving the cursor way up. I wouldn’t be surprised if motion design took a much bigger role on the watch than it does on the phone, because it will bring much more emotional value.

This personal relationship with the object is also why Apple chose to communicate about it through the vectors of fashion and taste. For one thing, watches have always been present in fashion magazines, so it makes sense. But it’s also a way to place the Watch in a broadcasting arena where all the other competitors have no voice. Vanity Fair and watches intersect pretty well; Vanity Fair and LG or Samsung, not so much. This type of media has a reach tech publications can’t dream of, so it’s a way for Apple to shut their competitors out of the debate “should you have a smart watch or not, and if so, which one?”.

Pitch dark.

In many of the animations, like the fitness app badges, it’s as if the watch were a little dark room where the only light comes from the window to our world that is its screen. Floating objects come up to you, out of darkness. There is a distinction between what lives on the surface of the screen and what lives beneath it.

Taptics

It seems we’re about to see the revival of Morse code communication. Jokes aside, I wouldn’t be surprised to see messaging apps letting you draw out emojis from tap sequences and force touches.

I’m personally very happy to see the “taptic engine” feature happen. Ten years ago, in a personal connected-watch project, I had this idea of a “hey, you, tap tap tap” signal from the watch when receiving a call or a text message, so it’s a good feeling to see it exist at last (and in a much more elegant incarnation, too).




As a final comment, it’s fascinating to see Apple unveil their new project little by little. The team building and design effort we know of, the industrial scale involved, the communication strategy playing out: every facet of this project is a case study worth the attention of anyone who builds products.

Notes

* My thanks to Craig Hockenberry for hosting the file.

September event
59:00 [Cook]
Personal, precise, personal style and taste, because you wear it, what we didn’t do is take the iPhone and shrink the user interface and strap it on your wrist, meant to be worn, as much personal tech as it is style and taste.

64:35 [Ive]
So personal, completely singular product, intimate level, inspire desire, navigation is fluid and vital, fluidly, nimble, precise adjustments, lightweight interactions, quickly, efficiently, connect intimately with others, personal, subtle ways, accurate, all day everyday, preference and self expression, personalisation appearance, to be truly personal.


Interface sketching

a French version of this post is available

During a recent project, I put together a tactile interface sketching tool that lets you quickly evaluate layout and behaviour ideas. This post goes through why and how it came to be, explains how it works, details pros and cons, and ends with a few practical pieces of advice.

More and more articles explain the primary role of time and kinematics in producing meaning for tactile interfaces. But as of today, it isn’t easy for a designer to quickly sketch out an interface behaviour for a smartphone or a tablet.

Some existing tools help in this regard by letting you link static images to one another using hot spots and animations, recreating a desired experience. For example, tapping a menu button on the main screen would slide that screen to the right with a nice ease-out, making room for the menu itself to appear, its items fading in one after another.

These tools let you manipulate a rough idea of an interface on a device, but they offer a limited choice of behaviours. Only the most “popular” ones are available, the same ones developers can implement with little effort when they build an application.

For time- and resource-constrained projects, these tools are perfect for getting to a very high-fidelity prototype in no time. But if the set of constraints is a bit exotic, and you would like to experiment with new interaction models to answer them, these tools lose their advantages.

Asking a developer to sit next to a designer to produce interaction sketches which, for the most part, will be mercilessly discarded (that’s their goal in the design process) is a possibility, but it’s also a luxury only a small number of companies can afford. For projects where resources are slim and budgets tight, this comfort is out of reach. Does that make it acceptable to our colleagues that we carry on explaining our ideas through paper sketches paired with elaborate gestures and sound effects? Probably not.

Origami (graciously shared by one of those almost-infinite-budget companies, many thanks) seemed very promising in this regard when it appeared. But also very intimidating: the learning curve from total novice to confident Quartz Composer user is steep, and that learning time isn’t always justifiable for producing soon-to-be-discarded sketches.
Another pitfall of Origami is the mouse-driven nature of its interactions. To properly evaluate a tactile interface, you should manipulate it directly, so you can discover when active zones are too small or too close, or when the hand hides information needed to complete a task: small but important details that could go unnoticed with a mouse cursor. On the other hand, this tool seems ideal for fine-tuning an animation and transmitting its attributes to developers. But while the quality of an animation is crucial to getting an interface right, that is not our concern here: we’re talking sketching, not finish.

While I was looking for a way to produce interface sketches economically, Keynote for iOS looked like a good candidate. Documents can be edited in the OS X version and transferred to an iPhone or an iPad. Direct manipulation is a big win, but the disappointment is that you can’t actually manipulate the prototype any way you want. Forget pinches, list scrolls or carousel swipes: any touch on the screen and you’re off to the next slide.

But Keynote brought a substantial advantage to the table: transitions between two screens cost no effort. Whatever the shapes, their colors, their orientations... as long as the objects are present in the starting and ending slides, the transition is fluid and relatively predictable. It’s the aptly named “Magic Move” transition. I couldn’t let go of the time savings this feature allowed, so I made concessions on two fronts: holding the interface in my hands, and the relative spotlessness of my screen. The speed at which I could work and the kind of deliverables I could produce more than made up for these losses.

my smudge covered laptop screen
Designing tactile interface also means touching your screen

It took me about an hour and a half to get from a new document to the prototype shown in the video above, made of 14 slides. Keynote can export the slideshow as a video so you can share your ideas with colleagues, who can add notes using the timecode for specifics.

preview of the slides composing the example prototype
The bare truth of a Keynote prototype

How it works

Everything happens in Keynote, so having a Mac at hand is a good start. Of all the transitions available in the software, the only one used here is Magic Move. A really good name, because in most cases that’s exactly what happens: you build a start state and an end state, apply the transition, and you’re done. The time saved compared to solutions where you edit a timeline is enormous, because there is no timeline to edit.

Where to start

Pros

Cons

Some advice

Conclusion

With a few hundred slides made since last February, and colleagues (designers, developers and project managers) having validated the technique’s value, I thought it was ready to be shared here. I’d be interested in your feedback, and in seeing whether this idea can be pushed a little further, so we can all get better at quickly exploring interface ideas.

Download the file and tell me what you think. If this has taught me anything, it’s the importance of bringing an idea as close to reality as possible before asking a developer for their time and talent. It might be easy for us designers to visualise things mentally, but that ability only goes so far, and a number of small but important details can slip through. It’s important to get them out of the way, and this tool helped me a lot in that regard.

The iPhone frame used in the example file is from Pixeden.

If you download the Keynote file, you’ll need the Blokk font to read it correctly. It’s a great font for placing neutral placeholder text in a layout with no effort.

2014.09.03: as @louije reminded me on Twitter, Apple held a session about app prototyping during the last WWDC. I had seen a mention of this session on Twitter the week before, and was quite disappointed I couldn’t be there to find out whether Keynote was part of the recipe. I had forgotten about it until yesterday, so I watched the session. It turns out they use Keynote as well, only with much more talent. Take the time to watch it; it’s far more didactic and enjoyable than what I’ve written here. They chose to use Keynote for iOS in their process, which I decided to avoid, but the guiding principle is the same: how to decide as fast as possible whether an idea is worth exploring further.
