Recently I took my iPhone into Apple because the camera viewfinder was shaking crazily and making noise. I started with the Apple technical reference on their website, then a chat session with someone at Apple, and then finally, after a series of phone resets, determined that I had to go to the store. I feel the Apple experience has slowed down a bit as they push to keep people from going to the Apple Store for service. Naturally, people want to go straight to the store without trying any solutions themselves, and Apple wants to save money by pushing "online" resources first. In my case, getting an appointment at the store felt substantially more arduous than it was in years past.
Once at the store for my scheduled appointment, I checked in with the usual person at the front door. I eventually made my way to the back of the store and waited for someone to come out so I could "re-discuss" my problem yet again. During this discussion, the Apple person had an iPad and started the process of filling in forms. To Apple's credit, they have always incrementally improved their process. This time she used a "scanner" via the iPad camera to capture my phone's identification information. This minimized data entry, especially for the tedious phone ID character codes.
Unfortunately, however, the Apple person and I still had to fill in more forms to gather the information needed to fix my phone. The form asked for some items twice, for whatever reason, and posed questions with a limited set of answers that confused us both as we tried to complete the process. All told, I was asked the same questions for probably the fifth time.
Eventually, we finished the forms and she said it would be fixed in 1.5 hours. Awesome! And it was!! A new camera module.
Here are two ways this could be handled in the future.
In the first way, I walk into the store already knowing I am dropping my phone off to be fixed; I know because of my previous online chat and other contacts. The Apple Store person at the front directs me to a kiosk. The kiosk scans my phone, pulls up my history, and maybe asks a few questions. It does all of this using natural language; no forms or conventional interface navigation other than my signature. When it determines I need my camera fixed, it asks me to place the phone in a secure drawer in the kiosk, prints a pickup ticket, and tells me to come back in 1.5 hours. If at any time in the process the kiosk is not capable, or I simply need a "person," it directs one to come over.
The second way is to replace the kiosk with a person. BUT, the Apple person and their iPad offer the same interaction as the kiosk. The person and I interact using language, not by pecking at the iPad interface. If more diagnostics are needed, they are done, but again via language. The whole process is recorded so I or Apple can review it later as needed.
The key point is replacing user interface steps with natural language processing. This makes things easier for the customer AND allows Apple to hire from a wider range of skilled people, since staff no longer need to be trained on the current complex interface. It should also reduce customer wait times (a big issue right now) because it's more scalable. Very important.
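To make the idea concrete, here is a minimal sketch of how that language-driven intake might work under the hood: the kiosk still needs the same facts a repair form collects, but it fills them in from free-form conversation turns and only asks about what it still doesn't know, so nothing gets asked twice. Everything here is hypothetical and illustrative; the "understanding" step is stubbed with keyword matching where a real kiosk would use a speech and language model.

```python
# Hypothetical kiosk intake sketch: slot-filling from free-form dialogue
# instead of form fields. Names and logic are illustrative, not Apple's.
from dataclasses import dataclass, field

# The facts a repair ticket needs, regardless of how they are gathered.
REQUIRED_SLOTS = ("serial", "symptom", "component")

@dataclass
class Intake:
    slots: dict = field(default_factory=dict)

    def missing(self) -> list:
        """Which required facts are still unknown."""
        return [s for s in REQUIRED_SLOTS if s not in self.slots]

    def hear(self, utterance: str) -> None:
        """Pull whatever facts this turn contains (stubbed keyword 'NLU')."""
        text = utterance.lower()
        if "serial" in text:
            self.slots["serial"] = utterance.split()[-1]
        if "shak" in text or "noise" in text:
            self.slots["symptom"] = "viewfinder shaking / making noise"
            self.slots["component"] = "camera module"

    def next_prompt(self) -> str:
        """Ask only about missing facts -- never re-ask a known one."""
        remaining = self.missing()
        if not remaining:
            return ("Please place the phone in the secure drawer; "
                    "your pickup ticket is printing. Back in 1.5 hours.")
        return f"Tell me about your {remaining[0]}."

# One customer's two conversational turns fill every slot at once.
intake = Intake()
intake.hear("serial is C02XXXXXXXX")
intake.hear("the camera viewfinder is shaking and making noise")
print(intake.next_prompt())
```

The design point is the `missing()` check: because each turn can fill several slots at once, the dialogue shortens itself, which is exactly the opposite of a fixed form that asks everything in order, twice.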
So, Apple, spend some of that big-time $ in the bank on this!