December 22 2013
Michael Darnell writes on baddesigns.com about USB:
When you try to plug it in the wrong way, it doesn't go in. You just need to flip it over and plug it in. But the problem is persistent. I still catch myself doing this occasionally, even though I know it is a problem.
From my experience, this isn't entirely true; the situation is actually worse. When the connector doesn't go in, the user doesn't simply flip it over and solve the problem. The mathematical chance of plugging in a USB connector correctly on the first try is 50%, because there are two orientations the user may try. However, the real-world odds of getting it right on the first try are even lower than half.
Plugging anything in usually doesn't work perfectly on the first try, often because the user's positioning is slightly off. It is uncommon to position and angle the cable exactly in line with the port on the very first attempt.
This is even harder when the connection is not entirely visible. USB ports are often placed on the side or back of a computer, away from where the user typically interacts with it, and frequently in a poorly lit spot. A user will have a harder time plugging in a connector when the correct plugging path does not match their vantage point. The port may also be surrounded by several other cables, obstructing access.
People familiar with USB have prior knowledge and experience that the connector is not reversible. As a result, after a quick failed first attempt, a user is very likely to give up and try the reverse. Watch someone plug in a USB cable and you'll see that they don't make just one or two attempts. Because of human factors and the conditions USB connections are put in, users may make several attempts, repeatedly flipping the connector to make it work.
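A rough way to see why the real-world first-try rate drops below 50% is to simulate it. The sketch below is my own toy model, not data: `p_align` is a made-up probability that a push in the correct orientation is also aligned well enough to seat, and the user flips the connector after every failed push, as described above.

```python
import random

def simulate(trials=100_000, p_align=0.7, seed=1):
    """Monte Carlo sketch of plugging in a USB connector.

    A push succeeds only if the orientation happens to be correct AND
    the (hypothetical) alignment check passes; any failed push makes
    the user flip the connector over and try again.
    """
    rng = random.Random(seed)
    first_try = 0
    total_attempts = 0
    for _ in range(trials):
        correct = rng.random() < 0.5        # 50/50 starting orientation
        attempts = 0
        while True:
            attempts += 1
            if correct and rng.random() < p_align:
                break                       # seated successfully
            correct = not correct           # failed push -> flip it over
        first_try += (attempts == 1)
        total_attempts += attempts
    return first_try / trials, total_attempts / trials

rate, mean = simulate()
print(f"first-try success: {rate:.2f}, average attempts: {mean:.2f}")
```

Under these assumptions the first-try rate is roughly 0.5 × `p_align`, well below the naive 50%, and the average number of attempts climbs past two, which matches the repeated-flipping behavior you can observe.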
No actual testing was done regarding this issue. These are just my observations. If there have been actual studies performed, I'd love to hear about them. Send them my way: @mdznr
September 3 2013
NPR writes about removing references to the world outside the screen in iOS 7.
While removing references to the world outside the screen seems like a good idea, it doesn't always end up working.
Modern humans have existed for 200,000 years. Throughout those years, the basic properties and physics of the world we live in have not changed. We have learned to navigate the world fairly well, reading and interpreting lots of visual cues to create a mental map of our surroundings. With a good mental map, we can figure out how things work. This is great! Without visual cues like shadows, perspective, and relative size, it would be hard to navigate our world. For example, it would be very difficult to differentiate between a staircase and a ladder. Imagine walking up to a staircase, then moving your arms and legs vertically and failing to climb the stairs. How embarrassing and frustrating!
In iOS 7, a lot of those very basic visual cues we're used to are still there, and you might not even realize it. One that often goes unnoticed is occlusion: elements that are "closer" to the viewer are visible and block out (or obscure) elements behind them. It seems so trivial, yet iOS 7 still very much uses this principle. App icons appear over the wallpaper, app icons appear on top of the dock, alerts appear above apps, and the list goes on and on. We rely on these cues a lot more than most people realize; so much about our environment can be learned just from them. iOS 7 even adds cues like parallax to give a slight hint of depth in the OS. This stuff is great! It won't be consciously recognized by a lot of users, but it will add to the feeling of depth and vitality in the OS.
NPR writes that shadows have been removed from the OS because they're no longer necessary in this new digital world.
If you look at the buttons on the keyboard you'll see that they are very subtly three-dimensional. They even cast a shadow that you can see, if you look closely. And they're rendered with a fictional light source that hangs above the gadget itself.
In previous versions of iOS, shadows were used to hint at depth: the keys sat on top of the keyboard frame. If the light source above the device has really been removed, why are there still shadows? Yes, even in iOS 7, there are still shadows below the keys.
However, instead of the soft shadows of previous versions (light diffuses over distance), the shadows are sharp and simply offset vertically by 2px (1pt). This is something you would not see in nature, and it's confusing. It no longer reads as a shadow under the keys; instead, the keys look like they are lined with a thin bottom stroke, or are sitting on some sort of shelf or container. It's ambiguous. The entire point of those shadows has been lost: instead of being an informative visual cue for depth, the new design is inaccurate and confusing.
Shadows still exist in iOS in some places. Some of the best examples are seen in sliders (the thumb has a shadow to suggest it is above the track and can be moved) and switches (similar to a slider, but with only two possible states: off/on). This is incredibly important! These are subtle hints at how to interact with UI controls that appear across so many apps. Shadows are also present in view controller animations, Passbook passes (app and icon), the Camera app icon (inner shadow on the glyph), the Settings app icon (inner shadow on the glyph), below the yellow header in the Notes app icon, below the Videos app icon's black-and-white clapper board, on the tabs in the Contacts icon, and more.
So has the lighting source on iOS 7 really changed? It doesn't seem so.
Has the way humans physically see the world and create mental maps of objects in their environments suddenly changed overnight after being roughly the same for 200,000 years? No.
Yes, I realize that people with very poor eyesight, or none at all, have been able to get by, too. There have been great technical improvements in iOS 7, and a lot of the design changes shouldn't negatively affect those without sight. The technical improvements should benefit all users.
All information in the article regarding iOS 7 is publicly available and was presented in the WWDC Keynote or on Apple's website.
August 20 2013
Photos.app and Photo Stream Confusion
I recently came home to Connecticut after living all summer in California. While home, my family decided to show me pictures of their vacation. The vacation photos were wonderful, but let me tell you how much of a headache it was to show them. My family took all their photos on an iPhone. They knew the iPad could show the pictures, and they wanted to use it because it would not make sense to huddle around a little 3.5 in. screen and go through the camera roll. They took out their iPad, which should just have all the photos, since the same iCloud account is used on both devices. My mom opened Photos.app on her iPad and kept scrolling through all the photos, but didn't see the vacation photos. While I knew they would be in the Photo Stream tab, I wanted to see how my parents would approach this problem. After a while, they had just about given up and were going to take out their iPhone. At that point, I stepped in and had to explain the complicated topic of iCloud and syncing.

While doing so, I realized that this issue I have with Photo Stream isn't just a small annoyance. It is a *huge* issue that's preventing people from doing such a fundamental task on their iPad. Why were we about to crowd around a tiny screen when a several-hundred-dollar device with a gorgeous 9.7 in. screen was lying idle on the kitchen counter? The iPad is incredible, and it's more than capable of displaying my family's vacation photos. It was just too complicated to figure out.
After telling them to use the Photo Stream tab, we began looking at all the photos. We then came across a photo that was poorly lit. My mom tapped "Edit" and did an automatic adjustment on the photo. The photo became a bit brighter and easier to see. Then she tapped "Save". The photo reverted to how it was just a moment before. What happened? Photos in Photo Stream can't be edited; they can only be saved to the Camera Roll. If we wanted to see that edited photo, we would have to go back, find the edited photo, then return to Photo Stream to continue scrolling through the rest of the vacation photos. Now we have four copies of that photo: a copy sitting in the Camera Roll on my mom's iPhone, a duplicate copy in Photo Stream, an edited version sitting in the Camera Roll on my mom's iPad, and a copy of the edited version at the very end of the Photo Stream. This is unacceptable, and it becomes even worse when doing this for more than a couple of photos. To them, this should simply be *one* photo, and their devices should all just be windows into seeing all their photos, regardless of which device the photos were taken on and where they were edited.
After seeing all the photos, they wanted to share them with other family members who weren't home. It's too much of a hassle to describe how confusing that was to explain, so I'll just move on to how I would solve a lot of these issues.
My proposed solution
The way Photos.app and Photo Stream work really needs to be fixed. Photos.app should have four tabs: Photos, Albums, Faces, and Places.
Photos is where all photos taken on any device with the same iCloud account go. They are intelligently sorted and grouped by time and place, similar to how it works in iOS 7.
In Albums, you get a list of albums: hand-picked groupings of photos. There is no longer a difference between an album and a Shared Photo Stream; all albums are technically Shared Photo Streams. Whether to actually share one with other people is up to the user. Having them all technically "shared" has the advantage of syncing all albums across devices.
Faces is just what you'd expect: collections of photos grouped by who is in them. This would work similarly to how it does in iPhoto for OS X.
Places is also just what you'd expect: photos organized by where they were taken. It shows a map with groups of photos in the areas where they were taken; pinch to zoom in on an area for more fine-grained groupings. This would work similarly to how it does now.
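The core of the proposal is that a photo is *one* canonical object in a single per-account library, and an album is just a stream with an optional share list, so editing never spawns copies. A minimal sketch of that model (every class, field, and method name here is hypothetical, purely for illustration, and none of it is a real iOS or iCloud API):

```python
from dataclasses import dataclass, field

@dataclass
class Photo:
    photo_id: str
    taken_on: str                                    # device it was taken on
    edits: list = field(default_factory=list)        # non-destructive edit history

@dataclass
class Album:
    name: str
    photo_ids: list = field(default_factory=list)
    shared_with: list = field(default_factory=list)  # empty = private, but still synced

@dataclass
class Library:
    """One library per iCloud account; every device is a window into it."""
    photos: dict = field(default_factory=dict)       # photo_id -> Photo
    albums: dict = field(default_factory=dict)       # name -> Album

    def add_photo(self, photo):
        self.photos[photo.photo_id] = photo          # one canonical copy

    def edit_photo(self, photo_id, adjustment):
        # Editing records an adjustment on the canonical photo
        # instead of duplicating it.
        self.photos[photo_id].edits.append(adjustment)

# Take a photo on the iPhone, edit it from the iPad: both devices touch
# the same canonical object, so no second, third, or fourth copy appears.
library = Library()
library.add_photo(Photo("IMG_0001", taken_on="iPhone"))
library.edit_photo("IMG_0001", "auto-enhance")       # done from the iPad
```

In my family's scenario above, this model would have left exactly one photo with one recorded adjustment, visible from every device, instead of four diverging copies.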
Don't even get me started on trying to get some of these photos printed and framed.
Yes, I know it would've been easier to just AirPlay a slideshow from the iPhone to the Apple TV in the living room, but it's not a solution to the issues mentioned above.
December 10 2012
Why multiple animations? Pick one. And don't pick the cube one.
The pull-over animation should not have transparency behind it; that makes it look messy, like something you'd see in Windows 7. No.
Simply swiping away notifications would be very inconsistent with iOS. In most apps, swiping left or right brings up a delete button, but swiping alone won't delete anything. No.
On the Mac, how would you remove notifications? Clicking and dragging? No.
October 29 2012
Ballmer on Tablets
My goodness, Steve Ballmer is an idiot. Ballmer claims the Microsoft Surface is the only usable tablet, yet it ships with a beta desktop version of its most popular software, Office, which is perhaps the main reason any Office-using PC guy would want to buy one of these things. I can't tell if he truly believes this or if he's just trying to push the device. They're trying to push their new app platform, formerly named Metro, yet they can't even ship the device with a Metro version of Office.
Just look at how usable this thing is!