McMaster-Carr

Website Evaluation and Usability Testing

Role: End-to-end UX Researcher

Project Duration: 9 weeks

Introduction to McMaster-Carr

McMaster-Carr is an industrial supply company that provides a vast range of products, including tools, hardware, and electrical components. They are known for their extensive online catalog and information-packed website.

Intended Users: 

McMaster-Carr’s intended users are professionals such as engineers, facility managers, machinists, contractors, maintenance personnel, and procurement specialists who need a reliable source for industrial supplies and equipment. 

Problem Statement and Goals

McMaster-Carr faces several usability challenges that affect the key actions users typically perform on the website. Early exploration of the website uncovered confusing layouts, an overwhelming amount of information on each page, and a lack of visual hierarchy. Given this, we came up with a few goals to guide our research:

Goals:

  1. Identify key problem areas (e.g., product search, navigation, and filtering).

  2. Measure the ease of navigating the website and performing tasks. 

  3. Assess user satisfaction and overall experience with the website. 

  4. Evaluate accessibility.

Heuristic Evaluation

To guide our initial research and uncover the shortcomings of the McMaster-Carr website, we conducted a heuristic evaluation using Nielsen’s 10 usability heuristics to identify where the website violated these guidelines. Some of the biggest concerns identified in the cognitive walkthrough and heuristic evaluation relate to information clutter and navigation. The website’s layout, particularly in product listings, was rated a severity level of 3 due to the overwhelming amount of detailed information, the lack of visual hierarchy, and weak aesthetic design.

Additionally, accessibility issues were observed in the form of small font sizes, low contrast, and poor readability. Another major issue was the lack of clear visibility of system status, which hinders navigation and task completion.

Research Questions

With our proposed goals, we came up with the following research questions to address in our usability tests:

  1. Does the website’s navigation help the user reach their desired product specifications? 

  2. How does the user’s path differ from the expected or ‘happy path’?

  3. What challenges did the users face while navigating the website? 

  4. Did the users receive clear feedback for their actions? (e.g., adding a product to the cart, finding specifications using filters, choosing color options)

  5. Did the users notice this feedback? 

Participants

We recruited participants for our usability testing whom we believed would be most likely to use the McMaster-Carr website. The screening requirements were as follows:

  1. Participants must be over 18 years old. 

  2. They have used, or are familiar with, hardware websites similar to McMaster-Carr.

  3. They can speak and read English fluently. 

Usability Testing

We conducted moderated usability tests with the 5 participants who qualified from our screener. The usability tests were designed to evaluate how users complete specific tasks on the McMaster-Carr website and how intuitive the site is. The selected tasks were as follows:

Task 1: Find and view ‘Aluminum Threaded-Hole Ball Knob (3/4”)’

Task 2: Download a ‘3-D STEP’ file for ‘Aluminum Threaded-Hole Ball Knob (3/4”)’

Task 3: Add to cart – Find out the fastest shipping option 

These tasks were intentionally chosen to evaluate the information architecture, navigation, and layout of the website, as these were the areas our heuristic evaluation identified as needing improvement.

Evaluation Measures

In our usability tests, we aimed to measure results both quantitatively and qualitatively. Quantitative metrics included task completion success rate, task completion time, and System Usability Scale (SUS) rating. Qualitative data included participants’ post-task interview responses.

The SUS questionnaire, administered after the test session, consisted of the 10 standard SUS statements, each rated on a scale from 1 (strongly disagree) to 5 (strongly agree).
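As a point of reference, the standard SUS scoring procedure converts each 1-to-5 response into a 0-to-4 contribution (odd-numbered, positively worded items contribute the rating minus 1; even-numbered, negatively worded items contribute 5 minus the rating) and multiplies the summed contributions by 2.5, giving a score from 0 to 100. The short Python sketch below illustrates this arithmetic; the sample ratings are hypothetical and only demonstrate the calculation, not actual participant data.

    def sus_score(responses):
        """Compute a System Usability Scale score from ten 1-5 ratings."""
        if len(responses) != 10:
            raise ValueError("SUS requires exactly 10 responses")
        total = 0
        for item_number, rating in enumerate(responses, start=1):
            if item_number % 2 == 1:
                total += rating - 1      # odd (positively worded) items
            else:
                total += 5 - rating      # even (negatively worded) items
        return total * 2.5               # scale the 0-40 sum to 0-100

    # Hypothetical example: one participant's ratings for SUS items 1-10.
    example_ratings = [4, 2, 4, 1, 3, 2, 5, 2, 4, 3]
    print(sus_score(example_ratings))    # 75.0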

Results

Based on our usability tests we uncovered the following results:

All five participants successfully completed all three tasks, with Task 2 being the fastest and Task 3 taking the longest to complete. Errors increased during Task 3 due to unclear shipping options, resulting in a longer completion time.

In the post-test questionnaire, the five participants on average rated the system as easy to use without the assistance of a technical person. However, they gave lower ratings for the website's functionality and design appeal, expressing concerns about the cluttered UI design and inconsistent navigation. As a benchmark, a SUS score above 68 is considered above average and suggests good usability, while a score below 68 is below average and indicates usability issues.

Findings and Recommendations

  • Simplify Navigation and Minimize Clutter: Consolidate menus and increase white space to make the interface visually appealing and less overwhelming.

  • Improve Delivery Information Accessibility: Provide clear access to delivery options in the “Order” section and enable users to view estimated delivery times as guests. 

  • Enhance File Download Visibility: Make the download button for files, especially 3D STEP files, more prominent with clearer labels. 

  • Offer a Guest Checkout Option: Allow users to find shipping options and complete the purchase process without account creation. 

  • UI Adjustments for Aesthetics: Redesign elements with improved spacing and aesthetic structure to make the interface more attractive and accessible. 

Conclusions

Overall, our usability test provided data pointing to several aspects of the user experience that could be improved. Tasks 1 and 2 were relatively easy to complete, but participants still expressed a desire for a more aesthetically pleasing design and less cluttered organization. Task 3 showed the need for a guest checkout or an easier way to access shipping information and other details, which would alleviate user frustration and reduce task completion time. Expanding our usability test to a larger participant group would help solidify our results and findings.

The McMaster-Carr website performs well in terms of search and filter functionality, but major improvements in navigation, visibility, and aesthetic appeal are necessary. These findings suggest that users value clarity, ease of access to shipping information, and the ability to check out as guests. Implementing the recommended changes would likely improve user satisfaction, task completion speed, and overall user confidence on the site, making it more competitive and appealing in the industrial supply domain.