Perfect Holidays




Selling a holiday is one thing. Selling the PERFECT holiday is quite another.



Case Study


On the Beach 


Lead UX Designer


Award Winner

The Problem

On the Beach has spent the past 10+ years optimising the “Flight plus Hotel” booking path. With the recent changes to the Package Travel Directive, allowing On the Beach to sell package holidays, the “Holiday Deals” booking path needed some TLC. Searching and filtering for a holiday has been optimised with great success, so now that users are able to find the most relevant deal, the “Holiday Landing Page” needed the same care and attention to allow users to edit the deals to create their perfect holiday.



Now that On the Beach are able to provide package holidays, users are able to take advantage of exclusive pricing whilst customising their holiday to meet their exact needs. The old Holiday Landing Page offered editability but not in an obvious way; the page structure was confusing and the experience differed considerably between desktop and mobile.


Users arrive at the Holiday Landing Page in multiple ways: by performing a search on the homepage, via deep links in emails and, soon, from third-party comparison sites such as Trivago. Due to the variety of traffic involved, people from all across the business had an interest in the redesign of the page, all with specific requirements and needs: some based on a business need, others based on providing additional value to the user, and some even provided as a set of guidelines by Trivago and TripAdvisor for optimising these types of pages. As a result, collaboration was key in this project, so that all stakeholders felt heard, bought in to the proposed direction and were happy with the end result.



Design Thinking is our collaborative systematic approach to handling problems and generating new opportunities. There are 5 stages – Empathise, Define, Ideate, Prototype and Test. Every stage needs input from multiple roles, so collaboration is not only key, it’s essential. Each person involved has a unique perspective with varying levels of insight, interest and investment in the project. Spending time on each stage ensures our solutions are user-centric and always focused on the right problems. The process is a flexible framework, which means we don’t always begin at the Empathise stage and work through to Test. Often, we already have a good idea of the defined problem or opportunity so we can start anywhere and work towards Test. It also means we can “loop back” into a stage once we’ve learned something new through testing.


Joint Application Design (JAD) sessions are now a solid staple of the Design Thinking process here at On the Beach. In these sessions, key stakeholders, designers, front-end developers, Rubyists and anyone else who has an interest or can bring value to a project is invited to understand the objectives, requirements, limitations and desired outcomes. Everyone has a voice and can contribute towards creating a direction to ensure success. Most importantly, these sessions create “buy-in” for the chosen direction, getting everyone involved in the life-cycle of the project right from the start. Immediately, we all know who is who and how they can help, which in theory streamlines the process as we gather momentum. In this instance we had 16 people in the session – maybe too many – but there’s no arguing that the outcome put us on the right path from day one.


When it comes to data, On the Beach has a great process for split testing new features and design changes before rolling them out in full. The same can be said for the Empathise and Define stages of the Design Thinking process. Our analytics team can provide detailed information on clicks, interactions, engagement, progression, conversion, scrolling behaviour, sign-ups, drop-offs and more, so we use as much of this information as possible to validate our hypotheses and drive our decisions. Should a button be more prominent? Is the structure and order of a page correct? We let data guide us before testing our ideas and designs with prototypes.
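As a rough sketch of how a split-test result can be sanity-checked before a decision is made, the snippet below runs a standard two-proportion z-test on a control and a variant. All of the figures are invented for illustration; this is not On the Beach’s tooling or data.

```python
# Hypothetical split-test check: is the variant's conversion rate
# significantly different from the control's? Figures are illustrative.
from math import sqrt, erf

def z_test(conversions_a, sessions_a, conversions_b, sessions_b):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a = conversions_a / sessions_a
    p_b = conversions_b / sessions_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conversions_a + conversions_b) / (sessions_a + sessions_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sessions_a + 1 / sessions_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: 2,000 bookings from 100,000 sessions (2.00%)
# Variant: 2,150 bookings from 100,000 sessions (2.15%)
z, p = z_test(2000, 100_000, 2150, 100_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the p-value falls below 0.05, the conventional threshold for treating an uplift as real rather than noise.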


At On the Beach we are extremely fortunate to be able to user test using a variety of methods. We test internally with our Contact Centre staff (they use the website on a daily basis so we consider them to be our Super Users), we guerrilla test, we remote test online and we test formally in a lab. Depending on what we are testing and what we want to achieve from the test, we choose the most appropriate method. In this instance we tested our concepts and ideas using all of the above methods, involving all stakeholders in the preparation and outcomes of each test, and iterating and amending each time.

Test 1 – internal sense check with Contact Centre
Test 2 – guerrilla test in Manchester
Test 3 – internal sense check with stakeholders
Test 4 – formal lab-based test in Sheffield
Test 5 – remote online test
Test 6 – guerrilla test in Manchester
Test 7 – formal lab-based test in Sheffield
Test 8 – internal sense check with Contact Centre
Test 9 – internal sense check with stakeholders


Following all of the testing, iteration, amending (and more testing), we felt confident that the designs, page structure and new features would achieve all of the objectives set at the start of the process. Through the whole project, conversations had been taking place between design and development to ensure that there would be no blockers when a final page was “signed off”, and to also allow the development team to start working on the back end and some fixed elements. Again, this collaboration meant that there were never any unforeseen or otherwise avoidable delays added to timescales, but more importantly it meant that both teams, design and development, could work seamlessly in tandem on separate, but connected workflows.


On the Beach takes an agile approach to its workflow, which means releasing new product updates often and iteratively based on split test data. With a full page redesign, it becomes very difficult to do this without releasing the full page (test) in one go and testing against the original (control). So, on this occasion, we took a creative approach to split testing: as well as testing against user data, hypotheses were tested step by step to ensure that new functionality, features and page layout each worked individually before the full page was released.


Step 1 tested the new headers (a 1.02% increase in progression). Step 2 reordered the page structure so that all editability options sit in the same place (interaction increased across all elements by a minimum of 4.67%), and finally step 3 incorporated the sticky headers and CTAs (early data shows an increase in conversion rate equating to £2m per year). The project has brought members of multiple teams closer together, given us a greater understanding of what we do and how we can help each other and, in doing so, allowed us to identify sooner where problems might arise and how we can deal with any limitations to break new ground.
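To illustrate the arithmetic behind translating a conversion-rate uplift into an annual revenue figure like the one above, here is a back-of-envelope sketch. The traffic, conversion and booking-value numbers are assumptions chosen purely so the sums are easy to follow; they are not On the Beach’s actual figures.

```python
# Back-of-envelope: annual revenue impact of a conversion-rate uplift.
# Every figure below is an assumption for illustration only.
annual_sessions = 20_000_000   # assumed yearly landing-page sessions
baseline_conversion = 0.02     # assumed 2% baseline booking rate
avg_booking_value = 400.0      # assumed revenue per booking (GBP)
relative_uplift = 0.0125       # assumed 1.25% relative conversion uplift

extra_bookings = annual_sessions * baseline_conversion * relative_uplift
extra_revenue = extra_bookings * avg_booking_value
print(f"Extra bookings per year: {extra_bookings:,.0f}")
print(f"Extra revenue per year: £{extra_revenue:,.0f}")
```

Under these assumptions a 1.25% relative uplift works out to roughly £2m per year, which shows how a small percentage change on a high-traffic page compounds into a large absolute figure.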


