Ola for everyone: UX case study
Redesigning Ola to accommodate those with hearing and speech impairments.
Why Ola?
Most Indian suburbs rely on auto-rickshaws for their day-to-day commute. Since Ola lets users book auto rides directly from the app, it is especially popular among Indian users.
Background:
Roads and neighborhoods in India are complicated enough that even GPS sometimes fails to locate someone accurately. Because of this, most Ola drivers end up calling passengers to confirm their exact location. Although this works fine most of the time, it leaves out a certain group of passengers: those with hearing and speech impairments. Ola does provide an alternative of sending text messages to drivers, but other issues, such as differences in language and literacy, make it very hard to communicate even the simplest instructions.
If passengers are unable to pick up the call or tell the driver their exact location, they end up having their rides cancelled. Because of this, deaf and speech-impaired passengers usually have to rely on someone else to book their rides, or prefer not to use such apps at all.
Problem Statement:
Ola currently has no specific feature to help people with hearing impairments, nor does it provide any way for them to inform the driver about their disability other than explaining it themselves once they get in for their ride. This makes people who are hard of hearing hesitant to use the app on their own.
My goal is to make it easier for hearing-impaired passengers to book and complete their rides, and to encourage them to be independent.
Process:
User Interviews:
The target users here are passengers who have some type of hearing and/or speech impairment.
Since the lockdown made it hard to meet users in person, I conducted informal interviews with 2–3 users, asking about their experiences with ride-hailing apps, the reasons they avoid them, and what they would like to see in such apps.
Along with this, I also browsed forums like Reddit and Quora to see whether other users face similar issues and how they feel about them.
What did I understand from this?
- It is hard to get bookings confirmed on Ola, since drivers cancel rides immediately after failing to reach the passenger by phone.
- It is difficult to explain directions over text, and most of the time drivers do not see or acknowledge the texts because they are in the middle of driving.
- They have to rely on someone else to book their rides and talk to drivers, which reduces their confidence to go out and try getting a ride by themselves.
- There is no way to let the driver know about their disability before getting into the ride and explaining it themselves. This causes a lot of miscommunication and delay, since the driver usually asks for the OTP before the ride begins and the passenger struggles to understand the request and communicate the number.
Competitive Analysis:
So far, I was able to find only two apps with dedicated features for hard-of-hearing users: Lyft (works only in the US and Canada) and Uber (works in India).
Both apps offer features mainly for deaf drivers, while passengers are only given the option of using text as the mode of communication.
Ola, as of now, doesn't have any specific feature for either group.
User Persona:
User Flow:
Design:
I started by sketching wireframes on paper and incorporating the additions into the existing app. I tried not to introduce any drastic differences in the interface for deaf users, but instead focused on how the existing flow could be improved to accommodate their requirements.
My design mainly addresses these 3 pain points:
Pain Point 1: There is no way for the passenger to identify as Deaf from the Ola app.
Solution: Let passengers identify themselves as deaf or hard of hearing in the app; this status is also shown to drivers when they confirm the ride.
Pain Point 2: Need a better way to let the drivers know their exact location without having to call them.
Anytime someone asks you for directions, you rely on describing the place visually: come to the main gate of the red building, I'm standing opposite that big blue shop, and so on. Once the other person can visualize the spot, it becomes much easier to find. Since describing it verbally is not possible here, a photo is a more efficient way to show the other person where you are.
Solution: Provide an option to click a photo of a unique, identifiable landmark near the pickup point to help the driver find it. The passenger can click the photo once the ride is confirmed, and the driver can also request a photo from the passenger.
Along with this, the call option is disabled once users identify themselves as deaf or hard of hearing, so they are not overwhelmed by a driver trying to call.
Pain point 3: Passengers find it hard to communicate important details like OTP to the driver.
Solution: Show a pop-up with the OTP, large and visible enough for the driver to read, that opens as soon as the driver reaches the passenger's location. That way the passenger also knows they have to show it.
Learnings:
I learnt a lot about accessibility and the problems people face with tasks that otherwise seem simple. The biggest challenge was making accommodations for accessibility without affecting the existing user flow of the app.
Even though this study considers only hearing-impaired passengers, I realized that other people face similar problems regardless of whether they are deaf. So I tried not to make the features so specific that only people with these disabilities would use them; anyone trying to book a cab can benefit.
I do wish I could see these features work in real-life scenarios, with real drivers and passengers, to see how much they improve the experience.
Since this is my first case study, there is still a lot of scope for improvement, so please feel free to let me know where I could have done better. Thank you for giving your time :)