Team summer presents
Ratings
A project with the goal of using ratings to help our customers provide guests with an even better on-site experience... with a few derailments along the way.
Problem statement

The rating system had a lot of untapped potential in terms of quality and value. Our task was to drill down into feedback, believing that we could help our customers provide a better experience for their guests, hopefully leading to a boost in sales.

In addition, the way feedback was previously collected led to some inconsistencies regarding who it was given to and what it was about. For instance, our customers might receive feedback directed at Ordr, or their average rating might drop due to Vipps downtime.

Identified problems

  • Ambiguous wording. All parties were unsure of who the subject of the ratings was. It was not clear whether the Ordr webapp or the place of service was being rated.
  • Unutilized potential. The feedback received was unclear and of relatively low value to both customers and Ordr.
  • Premature ratings. How can you rate things you haven't experienced? For instance, food that arrives later or service not yet received from the waiters?

[more description of the different problems that we encountered]
[inserting a short video of original rating system?]

Our Solutions
1
Create more engagement
Encourage guests to submit a rating. The goal is to increase ratings per transaction to 10%.
2
Utilize rating data better
Utilize rating data in the most useful way possible for customers and guests. Collect more specific feedback in multiple areas: product, service, and app.
3
Work towards the North ☆
Contribute to working towards our North Star. Ensure that rating-related changes do not interfere with other important aspects of the confirmation page.

Work process
We have worked towards our goal over seven weeks and gone through six different phases.
1
Discover
We got to know Ordr and started to discover how the user journey works. We then tested in QA with different locations and tried to get the whole picture of the Ordr experience. In this way, we got insight into the problem and discovered major pain points along the way.
2
Define
Defining the problem and scoping down the focus was of great importance for the process. This way, we defined our main goal and focus areas for the project, and customized the task considering the time we had available and what was realistically achievable.
...
3
Design
The summer team brainstormed how we could solve the problem and started to ideate different solutions. Several mockups were designed, and both qualitative and quantitative user tests were carried out. Developing potential solutions and testing them along the way gave us grounds for the different decisions that were made.
4
Build
Coding and further development were done to build the solution. We followed the lean startup methodology and worked according to the "Build-Measure-Learn" principle so that we could ship several changes.
5
Iterate
We made iterations as we got feedback from user testing. Important data was also used to confirm or reject various hypotheses we had.
6
Deliver
The final product was delivered in the last week, and the final feedback was received.

Experiments and releases
Over the last seven weeks, we have run experiments and user tests to track changes and check our hypotheses. In weeks 1-2, we were responsible for ratings and released emojis. The following week, we took responsibility for the whole confirmation page and released three categories. In weeks 4 and 5, we redesigned the "New order" button, "Order details", and the rating section, turning it into a modal. In week 5, we also enhanced the "New order" button and utilized the space on the confirmation page better. Week 6 consisted of A/B testing 1 modal vs 3 static categories, A/B testing order details collapsed vs expanded, and registering the rating on each click. In the last week, we displayed the new rating content in admin, released emoji animations, and changed the top-rating question from "what can we improve" to "what did you enjoy most".


Week 1-2 - Emojis
Week 3 - Three categories
Week 4 - Redesign "New order" button. Redesign "Order details"
Week 5 - Redesign rating section to modal. Enhance "New order" button. Utilize space on confirmation page better
Week 6 - Register rating on each click. A/B test displaying 1 (modal) vs 3 categories (static). A/B test order details collapsed vs expanded.
Week 7 - Display new rating content in admin. Add emoji animations. Change top rating from "what can we improve" to "what did you enjoy most"
Week 1-2
Redesigning the stars into emojis

We assumed the use of emojis would create increased engagement among guests. We believed it is easier to relate to emotions than to stars. We also thought that the increase in colors would draw more attention towards the rating.

User testing at Oslo Street Food

The second week, we visited the Oslo Street Food food court to user test the transition from stars to emojis. The testing was qualitative, and five interviews were carried out. We had a new version released to production and several versions in the form of Figma prototypes. This way, we could user test and interview guests to see what they preferred.

  • We tested different colors: one with blue as the best, one with Favrit colors, and lastly one inspired by traffic-light colors.
  • We also tested different expressions, checking how well the users managed to see and understand the emotion each emoji conveyed.
  • Additionally, we had some versions with a single color for the emojis and only the outline marked, to check other variations.

The results of our observations showed us that most people do not rate. They also showed that users prefer traffic-light colors together with the clearest and biggest emoji expressions. The users did not want to interrupt their visit by spending time rating the experience; therefore, the rating should run as smoothly and quickly as possible, while still giving important and specific feedback to Ordr's customers.
Week 3
Redesigning the rating into three categories

This week we redesigned the rating section into three categories, each with belonging questions and emojis. Our theory was that having these categories and clickable sub-options would make it easier and faster to give more specific feedback, which might lead to more users doing so. Communicating what they are happy or unhappy with becomes easier when they are offered predefined, standardized questions and options. This also creates more value and categorized insight for our customers.
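Roughly, the standardized data behind such a rating can be thought of like this (a minimal sketch: the type names, categories, and 1-5 score scale below are our own illustration, not the exact Ordr schema):

```typescript
// Illustrative shape of a categorized rating submission.
// Names and the 1-5 scale are assumptions; the real Ordr schema may differ.
type RatingCategory = "product" | "service" | "app";

interface CategoryRating {
  category: RatingCategory;
  score: 1 | 2 | 3 | 4 | 5;       // which emoji the guest selected
  selectedOptions: string[];      // predefined sub-options, e.g. "long wait"
  comment?: string;               // optional free text
}

interface RatingSubmission {
  orderId: string;
  locationId: string;
  ratings: CategoryRating[];      // one entry per category the guest answered
  submittedAt: string;            // ISO 8601 timestamp
}
```

Because every submission uses the same categories and option lists, the feedback can be aggregated per category instead of arriving only as free text.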



At this time there was also a release with new designs on the confirmation page, which conflicted somewhat with ours. It also just made sense for us to take ownership of the entire confirmation page moving forward, as it would make it easier for us to run experiments with the rating section.



We also went to Vippa for some interviews and user testing. We didn't only focus on the rating system, but on the entire confirmation page. We also tested a prototype visualizing designs for a modal-based rating.



All in all, we collected a lot of valuable feedback for the entire webapp. For the confirmation page, we saw that many guests did not spend much time on this page; the majority were mainly interested in seeing whether the order was confirmed. Most enjoyed having the rating split up into bite-sized chunks through the modal rather than having it all served at once with the three static categories. On the other hand, it was mentioned that the process felt a bit long. Several said they liked being able to choose alternatives within each category.

Week 4
Redesigning elements in the Confirmation page
The "New Order" button was redesigned into a floating button that expands when the window is on top, and shrinks to a "+" sign when the users scroll down.

Elements in the Order details box were rearranged, and a collapsible function was added.

Finally, whitespace was removed to compress it as much as possible in order to show the rating box underneath.
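A minimal sketch of how the scroll-driven button behaviour described above could be wired up (plain DOM code; the element IDs, class names, and 40px threshold are made-up assumptions, and the actual webapp implementation may differ):

```typescript
// Sketch: expand the floating "New order" button near the top of the page,
// collapse it to a "+" once the guest scrolls down.
// Element IDs, class names, and the 40px threshold are illustrative assumptions.
const newOrderButton = document.getElementById("new-order-button");
const newOrderLabel = document.getElementById("new-order-label");

window.addEventListener("scroll", () => {
  const atTop = window.scrollY < 40;
  newOrderButton?.classList.toggle("expanded", atTop);
  newOrderButton?.classList.toggle("collapsed", !atTop);
  if (newOrderLabel) {
    newOrderLabel.textContent = atTop ? "New order" : "+";
  }
});
```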

User testing at Delicatessen
The fourth week, we had four rounds of interviews and user testing with guests. In this user testing, we wanted to test whether the users understood:
  • how rating in the modal window worked, together with all the components inside the modal (buttons, progress bar, steps and labels)
  • the redesigned "New order" button
  • the collapsible order details box

This user test was important for gaining insight into what the users liked in this version of the page, as well as into some pain points in the redesigned elements. We found that:
  • The users liked the collapsed version of the order details box, and they found the rating in a modal window clean and intuitive.
Some pain points we found were:
  • The button to open and close the order details was sometimes hard to find, especially when sitting in the sun.
  • The users did not notice the "New order" button much, and thought it was easy enough to just scan the QR code again.
  • They found it unnecessary to be asked for improvement suggestions when they had given the best score.

All in all, the feedback we collected was hugely valuable, as we found which areas to improve and which areas were working as intended.
Week 5
Redesigning rating section to modal. Enhancing "New order" button. Utilize space on confirmation page

Our theory was that compressing the three static categories into a single modal would make it less cumbersome to start a rating. Having clickable options makes it easier and faster to give a rating, which leads to more users doing so. It also creates more value and insight for customers, as the feedback is standardized and categorized.

This week we were also asked to assist Team Guest with conducting a test of the collaboration feature at Delicatessen.

[insert key insights from testing at Deli-Collaboration here]
picture!
Collaboration testing at Delicatessen
In week 5, we had an internal test of the collaboration feature with Ordr employees. We tested four scenarios which alternated between which party opened the bill and which party closed it. The scenarios consisted of:

  • Guest opening and closing bill
  • Waiter opening and closing bill
  • Guest opening the bill and waiter closing the bill
  • Waiter opening the bill and guest closing the bill

We worked closely with three waiters from Delicatessen to collect all the feedback and questions we could possibly get. The feedback and insight we got from the Ordr employees and the waiters showed us that the possibilities of the collaboration feature should be communicated even more clearly. At the same time, we also got a lot of inspiring ideas for what can be done and all the possibilities that exist for the collaboration.

    Week 6
    Register rating on each click. A/B test displaying 1 (modal) vs 3 categories (static). A/B test order details collapsed vs expanded.

    We could see from tracking events in Google Analytics that, because the rating only gets submitted after the user clicks the "send feedback" button, a lot of ratings were lost, as many guests only click on the emojis. We therefore updated the functionality to register feedback straight away, and as expected we saw a significant increase in ratings.
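The change boils down to persisting the score on every emoji click instead of waiting for "send feedback". A simplified sketch (the endpoint, event name, and gtag call are hypothetical, shown only to illustrate the idea):

```typescript
// Sketch: persist the rating as soon as an emoji is clicked, so the score is
// not lost if the guest never presses "send feedback".
// The endpoint, event name, and gtag usage are illustrative assumptions.
declare function gtag(...args: unknown[]): void; // provided by the GA snippet

async function onEmojiClick(orderId: string, category: string, score: number): Promise<void> {
  // Register the score immediately (an idempotent upsert per order/category).
  await fetch("/api/ratings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ orderId, category, score }),
  });

  // Track the interaction so the rating funnel stays visible in Google Analytics.
  gtag("event", "rating_emoji_click", { category, score });
}
```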

    A/B Testing
    We successfully conducted two A/B tests: a collapsed vs expanded Order Details box, and a modal vs static Rating box. Our hypothesis was that the more interactive modal version generates the most valuable feedback, but we wanted to see the effect of both, and therefore ran A/B tests on both of them.
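As a rough sketch of how such an A/B split can be assigned and tracked (the storage key, variant names, and gtag call are our own illustrative assumptions, not the exact setup we used):

```typescript
// Sketch: give each guest a sticky A/B variant and report it to analytics,
// so ratings can later be compared per variant.
// The storage key, variant names, and gtag usage are illustrative assumptions.
declare function gtag(...args: unknown[]): void; // provided by the GA snippet

type RatingVariant = "modal" | "static-categories";

function getRatingVariant(): RatingVariant {
  const key = "rating-ab-variant";
  const stored = localStorage.getItem(key) as RatingVariant | null;
  if (stored) {
    return stored; // keep returning guests in the same group
  }
  const variant: RatingVariant = Math.random() < 0.5 ? "modal" : "static-categories";
  localStorage.setItem(key, variant);
  gtag("event", "rating_variant_assigned", { variant });
  return variant;
}
```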
    Week 7
    Displaying new rating content in admin. Emoji animations. Change top rating wording from "what can we improve" to "what did you enjoy most"

    A goal was to increase the value of the ratings. We therefore spent a lot of time during the summer developing the admin dashboard to display the new data we are collecting in a way that is useful to the customers. Along with team marketing, we also decided to focus on what the customers liked, instead of what they disliked, when giving a top rating. We think that in the future this could be valuable as a selling point both for restaurants and for Ordr. We also animated the rating emojis in an attempt to draw more attention and create more engagement. Finally, we managed to add the possibility to rate from the Ordr receipt.
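To illustrate the kind of attention-drawing animation this refers to, here is a generic Web Animations API sketch (the selector, keyframes, and timing values are made up, not the exact production animation):

```typescript
// Sketch: give the rating emojis a small staggered "wiggle" to draw attention.
// The selector, keyframes, and timing values are illustrative assumptions.
document.querySelectorAll<HTMLElement>(".rating-emoji").forEach((emoji, index) => {
  emoji.animate(
    [
      { transform: "rotate(0deg) scale(1)" },
      { transform: "rotate(-8deg) scale(1.1)" },
      { transform: "rotate(8deg) scale(1.1)" },
      { transform: "rotate(0deg) scale(1)" },
    ],
    {
      duration: 600,
      delay: index * 150, // stagger each emoji slightly
      iterations: Infinity,
      easing: "ease-in-out",
    }
  );
});
```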




    Final product
    After seven weeks of hard work, we are proud to present the final product, consisting of a revamp of the confirmation page and a whole new feedback dashboard in admin.


    Vision moving forward
    Everything we'd love to do, but couldn't do or didn't have time for:
    Product rating
    Allowing guests to rate individual products, giving us valuable data ... - for guests, customers, and suppliers.
    Showing ratings on ordr.no
    Guests can find locations on ordr.no, but have no easy way to judge their quality. Showcase... sort...
    User integration
    With a user system we could give recommendations based on previous ratings and friends' ratings, and let you rate your previous orders.
    Out of scope
    • Interaction between collaboration feature and rating
    • Investigate the possibilities of "Send receipt by email"
    • Investigate hotjar/GA user testing
    • Add the point of time for tracking of the rating and show it in admin
    • Show rating for each result
    • Sort the rating results
    • Show product ratings in the menu
    • Investigate opportunities for collaboration with waiters at the eatery
    • Insert filter for three categories in Power BI
    • Add the ability to rate products
    • The guest can choose to leave contact information on the rating
    • Button to verify order on confirmation page. Updates delivered status in the order log, and vice versa
    • Show product rating for customers in admin
    • Show product ratings for suppliers
    • Tip emojis for checkout page
    • Accessibility improvements in feedback dashboard - shades and other patterns to distinguish the rating summary of the donut

    What we have learned
    At the start of the summer we learned that in Ordr we set goals that have a 50% chance of succeeding. We set ourselves a big hairy goal and worked hard to achieve it. Inspired by Edwin, we know that "The impossible is just something that has not been done yet". We have learned to work with each other, both as individuals with different backgrounds and as a team working towards a common goal.

    We have also had a lot of fun along the way. The weeks in the Ordr office have been filled with laughter and fun nights. Thanks to Eirik H, we will always remember that when money is tight: "You can get full on beer, but you can't get drunk on oatmeal".