Visual Differ: How We Automated Visual Testing of Website Updates
Remember playing those “Spot the Difference” games when you were a kid? You’d open up your puzzle book or flip to the back of the cereal box and be presented with two pictures that were almost identical. The following 30 minutes were spent intensely scanning both pictures to find the differences between the two — a different colored flower pot, four trees instead of three, or some other frustratingly small detail.
Updating and testing web page changes can be a similar exercise. Our web team is often tasked with helping our clients make updates to their websites, switching out data, and updating the code. We then test each page to ensure the pages are correct — a time-consuming manual process.
When our web team was tasked with helping a client make several updates to profiles housed on their website, we realized we could streamline the process and save our clients valuable time and money. Our team set to work developing a better way to test data changes on web pages, making testing quicker and more efficient.
We first asked ourselves the question, “How could we compare an old page versus the updated one?” At first, we considered comparing only the underlying data between databases, but that would not account for any additional logic contained in the profile pages themselves. That left us with a single option: testing the visual difference between the pages.
Our clients are often updating many pages at once, meaning there is a lot of information to review and double-check. For this particular project, we were looking at almost 2,000 pages – a project that would take roughly 167 hours to review manually.
We realized manual testing is not very effective and automation could free up both time and resources, giving our team and our clients back valuable hours for other projects. Thus, we decided to create a program that could compare web pages and create a visual difference map between the two in a fraction of the time.
Before we began, we decided to plan out the goals of this automated test.
Detect visual differences between page A and page B
Take screenshots of each page and create a visual difference map between the two
Export a CSV recording whether each page passed or failed
Make the tool reusable for future projects
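The heart of that second goal, the visual difference map, boils down to a pixel-by-pixel comparison. The sketch below is a minimal illustration of the idea, not the actual Visual Differ code: it assumes the two screenshots have already been decoded into same-sized grids of RGB tuples, and it highlights any mismatched pixels in red against a black background.

```python
def diff_map(old_pixels, new_pixels):
    """Compare two same-sized pixel grids (rows of (r, g, b) tuples).

    Returns a difference map the same size as the inputs, plus a count
    of changed pixels: matching pixels become black, differing pixels
    are highlighted in red.
    """
    highlight, background = (255, 0, 0), (0, 0, 0)
    diff, changed = [], 0
    for old_row, new_row in zip(old_pixels, new_pixels):
        out_row = []
        for old_px, new_px in zip(old_row, new_row):
            if old_px == new_px:
                out_row.append(background)
            else:
                out_row.append(highlight)
                changed += 1
        diff.append(out_row)
    return diff, changed
```

In practice a tool like this would capture the screenshots with a headless browser (Selenium, Puppeteer, or similar) and decode them with an image library; only the comparison step is shown here.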
Introducing Visual Differ, our new tool for quickly testing changes to your web pages!
Visual Differ takes a URL, or list of URLs, and captures a screenshot of the old page and the new page. It then compares the two and creates a third screenshot that visually highlights any differences between the old and new pages, along with a CSV of the status of each URL after the program has run.
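The CSV step is straightforward to sketch. The helper below is a hypothetical illustration (the function name and the `(url, changed_pixel_count)` input shape are our own assumptions, not Visual Differ's actual interface), using Python's standard `csv` module to record a pass/fail status per URL:

```python
import csv


def write_status_csv(results, path):
    """Write one row per URL recording whether its comparison passed.

    `results` is a list of (url, changed_pixel_count) pairs; a page
    passes when no pixels differ between the old and new screenshots.
    """
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["url", "changed_pixels", "status"])
        for url, changed in results:
            writer.writerow([url, changed, "PASS" if changed == 0 else "FAIL"])
```

Keeping the changed-pixel count in the CSV, rather than just PASS/FAIL, makes it easy to sort the report and review the worst offenders first.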
So how much time does Visual Differ actually save?
Each test takes approximately four seconds to complete, from the initial screenshot to creating the visual difference map and CSV. For our doctor profiles project, we were able to run a full test of the 2,000 pages in 2.3 hours — nearly 75 times faster than manual testing.
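Those savings follow directly from the per-page figures. A quick back-of-the-envelope check with the round numbers from this post (the real run clocked in slightly higher, at 2.3 hours):

```python
pages = 2000
seconds_per_page = 4   # approximate automated test time per page
manual_hours = 167     # estimated manual review time for the same batch

automated_hours = pages * seconds_per_page / 3600
speedup = manual_hours / automated_hours

print(f"automated: {automated_hours:.1f} h, speedup: {speedup:.0f}x")
# prints "automated: 2.2 h, speedup: 75x"
```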
With Visual Differ, you will also have a complete record of your testing with a folder of screenshots and a CSV automatically generated, allowing you to quickly review any discrepancies in data and resolve them.
Visual Differ drastically reduces the time it takes to run manual tests on updated web pages, taking the stress off of our developers and clients. In turn, we now have more time to dedicate to other projects and produce better, more valuable results for our clients.
To see the Visual Differ tool in action, view the full code on GitHub.
To learn more about how TrendyMinds can help transform your website, check out our full web service offerings.
Note: This test was run on a gigabit fiber-optic connection. Test times depend heavily on the user’s internet connection and on how fast the target servers respond.