
Preamble
To begin with, I must clarify that, under a non-disclosure agreement, I cannot share specific details about the project. This includes the client’s name, the product, certain functionalities, the list of completed tasks, and any other sensitive information that could affect the client’s business or reputation, or my own.

The client is a major international logistics company based in the United States. They offer comprehensive logistics services, covering everything from raw material extraction to transportation to factories and plants, both within the U.S. and globally.

This client operates through several divisions, each specializing in different aspects of logistics.

The division for which we carried out our work focuses on developing a Yard Management System (YMS).

🤖

What is Yard Management?

Yard Management Systems (YMS) handle various tasks, including:

  • Control of Vehicle Entry/Exit: Managing the movement of vehicles in and out of the yard.

  • Monitoring Inbound/Outbound Operations at the Dock: Overseeing the loading and unloading processes.

  • Providing Comprehensive Asset Information: Offering detailed data about assets in the yard.

  • Controlling Arrival Times, Idle Times, Docking, and Departure Times: Managing and tracking key time metrics related to vehicle operations.

  • Detailed Schedule Planning with Real-Time Adjustments: Facilitating precise scheduling with the ability to make real-time changes.

  • Collecting Additional Information about Assets, Drivers, and Trucks: Gathering extra details relevant to the yard’s operations.

  • Comparing Metrics, Calculating KPIs, and Generating Reports: Analyzing performance indicators and producing reports for better decision-making.

Problems

In this case study, I will focus exclusively on the Whiteboards module, without delving into the broader system or other solutions.

The client approached us with several challenges: an outdated interface, an overly complex visual design, cumbersome navigation, a lengthy onboarding process, and growing pressure from competitors offering more modern solutions.

Our goal was to refresh the existing interface and user logic, recommend improvements to the visual style, streamline user workflows, and enhance the overall intuitiveness of the system.

Below is a screenshot illustrating the module before we began our work.

🤔

Let's take a closer look

Color coding is a powerful tool that helps users quickly locate or identify information with minimal effort. However, in the client’s solution, this benefit was undermined. Users had to create new color schemes to make information more noticeable, leading to a cluttered interface overwhelmed with visual noise. This added complexity made the already challenging navigation even more difficult.

Moreover, there were issues with the placement of information on the cards, a lack of a unified design language, and inconsistencies in iconography and spacing, among others.

These problems significantly impact the business and, taken together, pose a serious risk to the market leadership the company has built over the years.

Research

Conducting product research was challenging due to the closed nature of our system, which limited direct communication with actual users.

We began by researching the project with a small group of participants, including fellow designers from other projects, the development team, management, and any available users. This approach helped us identify both strengths and weaknesses, understand their impact on the user experience, and determine areas for improvement.

From this process, we formulated several critical questions:

  • Where do problems occur, and what impact do they have on users?

  • What causes these issues, and can they be avoided?

  • How do these issues affect the business, and what improvements can be made?

  • What are the users’ pain points, why do they arise, and how can we address them?

Based on the gathered data, we developed a list of tasks and metrics to measure the system’s usability. These metrics included both subjective assessments of user perceptions and objective, data-driven evaluations.
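As one illustration of a subjective metric, perceived usability can be scored with a standard questionnaire such as the System Usability Scale. The sketch below shows the standard SUS calculation only; the actual questionnaires and metrics we used are covered by the NDA.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert answers.

    Odd-numbered items (index 0, 2, ...) are positively worded: contribution = answer - 1.
    Even-numbered items (index 1, 3, ...) are negatively worded: contribution = 5 - answer.
    The summed contributions (0-40) are scaled to a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, answer in enumerate(responses):
        if not 1 <= answer <= 5:
            raise ValueError("answers must be on a 1-5 scale")
        total += (answer - 1) if i % 2 == 0 else (5 - answer)
    return total * 2.5

# A fully neutral respondent (all 3s) lands exactly in the middle of the scale.
print(sus_score([3] * 10))  # → 50.0
```

A score like this gives a single comparable number per participant, which pairs naturally with the objective, data-driven measurements mentioned above.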

😣

Research Findings

  1. Complex Navigation: Users experienced difficulties navigating the system, struggling to locate and effectively use navigation elements.
     

  2. Lack of a Unified Visual System: The absence of a consistent visual design, compounded by the use of multiple styles, led to users spending more time locating desired functions or settings.
     

  3. Complicated Management System: The process of creating, viewing, or editing tasks was spread across multiple screens. This fragmented approach decreased user efficiency, making the product’s logic less intuitive and harder to master.
     

  4. Inefficient Task Management: The creation of new tasks was divided into several disconnected steps, increasing the likelihood of errors and diminishing overall efficiency within the system.

The research findings confirmed what we had anticipated. We were already aware of the necessity to update the interface design and optimize the product’s logic to enhance usability and streamline user interactions.

Ideation

The research indicated that our primary task was to update the user interface and restructure the system’s logic where necessary. The top priority was to redesign the cards in the Whiteboards module.

Key changes we decided to implement in the card design include:

  • Visual Simplicity and Intuitiveness — Users should be able to quickly locate and identify the necessary card, understand its status, and interact with it without encountering visual clutter.
     

  • Support for Existing Features — The new cards must support all current functionalities and provide flexibility for future feature expansion.
     

  • Ease of Configuration — Modifying a card should be intuitive and straightforward for all users.

After aligning these ideas with the client’s team, we compiled a comprehensive list of information to be displayed and functions to be made accessible to users for effective interaction.

ℹ️

Information to Display

  • Dock Number

  • Current Dock Status (e.g., operational, occupied, blocked, damaged, etc.)

  • Time Indicator (displays the time spent at the dock)

  • Selective Indicators (e.g., checkboxes)

  • Vehicle Identifier — Number with a link to the vehicle’s detailed page and available operations

  • General Information about the Vehicle or Assets (with the option to highlight specific items)

  • Type of Truck or Asset

  • Information on the Next Truck in the Dock Queue (New function)

  • Information on Dock Reservation (New function)

⚙️

Functions

  • Dock operations (e.g., change status, add a comment, close, etc.)

  • Asset (Truck) operations at the dock

  • Operations for calling a truck to the dock (empty/full)

  • Queue management operations for the dock
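The display fields and functions above map naturally onto a card view-model. The sketch below is purely illustrative: every name and type is our assumption, not the client’s actual schema.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class DockStatus(Enum):
    OPERATIONAL = "operational"
    OCCUPIED = "occupied"
    BLOCKED = "blocked"
    DAMAGED = "damaged"

@dataclass
class QueueEntry:
    asset_id: str   # truck waiting for this dock
    position: int   # place in the dock queue

@dataclass
class DockCard:
    dock_number: str
    status: DockStatus
    time_on_dock_min: int = 0              # time indicator shown on the card
    selected: bool = False                 # selective indicator (checkbox)
    asset_id: Optional[str] = None         # vehicle identifier, links to detail page
    asset_type: Optional[str] = None       # type of truck or asset
    asset_info: list[str] = field(default_factory=list)    # general info, highlightable
    queue: list[QueueEntry] = field(default_factory=list)  # dock queue (new function)
    reservation: Optional[str] = None      # dock reservation (new function)

    def next_in_queue(self) -> Optional[QueueEntry]:
        """Information on the next truck in the dock queue."""
        return min(self.queue, key=lambda e: e.position) if self.queue else None
```

Keeping all of this on one object is what allows a single card to support every current function while leaving room for future expansion.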

Design

In designing the Whiteboards module, our goal was not only to modernize its appearance but also to create an interface that feels right in everyday use, not merely one that looks better.

Based on our research, we identified that our typical user is a man aged 38–50, without specialized education, who has used the existing system for years and is accustomed to its design and functionality.

For the visual design of the cards, we focused on updating only the elements that were underperforming, such as colors, icons, and highlights. At the same time, we preserved the familiar elements to facilitate users’ adaptation to the updated interface in this initial iteration.

Below is a snapshot of the card design evolution.

While working on the visual design of the cards, we engaged in an extensive process that included proposing and testing numerous hypotheses, creating multiple design variations, making and correcting mistakes, and adapting the solutions to meet both client requirements and usability standards.

The final version of the first iteration, displayed below, is the result of this comprehensive process.

🚀

The final card is annotated with the following elements: colored status bar; colored flag; identification; dock actions; asset move actions; dock number; asset ID; time on dock; list of asset information details; queue management and number of assets in the queue; request a full truck to a dock; request an empty truck to a dock; queue management icon identifier; next asset in the queue; highlighted asset information.

Created in alignment with the client’s requirements and users’ needs, the new card embodies a harmonious blend of minimalist design and rich information, essential for efficient system interaction.

Above are a few examples of the card design.

In the screenshot above, you can see the updated Whiteboards interface along with the new card designs.


The queue management function we developed and implemented provides the client with substantial market advantages. It accelerates warehouse operations and ensures automatic notifications to drivers regarding when to bring the next truck to the dock. This is achieved by considering factors such as queue position, cargo priority, driver availability, and other relevant parameters.

This approach not only streamlines the process but also minimizes the impact of human error in managing vehicle movement requests to the gates.
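A simplified sketch of the kind of prioritization such a queue manager performs is shown below. The actual factors, weights, and notification logic are confidential; the names and rules here are placeholders.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class QueuedTruck:
    truck_id: str
    queue_position: int      # lower = arrived earlier
    cargo_priority: int      # higher = more urgent cargo
    driver_available: bool   # driver reachable for a notification

def next_truck_for_dock(queue: list) -> Optional[QueuedTruck]:
    """Pick the next truck to call to the dock.

    Only trucks with an available driver are eligible; among those, the
    highest cargo priority wins, with queue position as the tiebreaker.
    """
    eligible = [t for t in queue if t.driver_available]
    if not eligible:
        return None
    return min(eligible, key=lambda t: (-t.cargo_priority, t.queue_position))

queue = [
    QueuedTruck("T-101", queue_position=1, cargo_priority=1, driver_available=True),
    QueuedTruck("T-102", queue_position=2, cargo_priority=3, driver_available=False),
    QueuedTruck("T-103", queue_position=3, cargo_priority=3, driver_available=True),
]
print(next_truck_for_dock(queue).truck_id)  # → T-103
```

Automating this selection is precisely what removes the manual, error-prone step of deciding which driver to call next.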

Testing

After concluding the main design phase, the next step was to evaluate its effectiveness through usability testing. The objective of this testing was to verify that the new design truly meets user requirements, facilitates ease of use, and does not introduce additional barriers to interacting with the system.

We selected several usability testing methods for a comprehensive evaluation of the product:

🥼

"Lab" Testing

Participants: 10 users from our team, assigned roles as gate operators in the system.

Process: Each user performed typical tasks such as changing gate statuses, adding comments, calling trucks to the gates, etc. We observed their actions, recorded the time taken to complete tasks, and noted any difficulties encountered.

Results: 80% of users completed the tasks without significant issues. Some difficulties were encountered with configuring identification flags and performing gate operations, which required additional explanations.

👁️

Behavioural Data Analysis

Process: We utilized analytical tools to track user interactions with the interface, allowing us to observe the frequency of specific actions, identify the most-clicked elements, and pinpoint where users hesitated or navigated away.

Tools:

  • Google Analytics: Enabled us to monitor overall usage trends of the system, including the most frequently accessed pages and features.
     

  • Hotjar: Provided user session recordings and heatmaps, which helped us understand where users clicked, how they moved their mouse, and where they lingered the most. This offered valuable insights into their journey through the system.
     

  • Mixpanel: Facilitated granular analysis of user behavior by tracking specific actions and creating interaction funnels, enhancing our understanding of user pathways through the system.


Methods: We employed heatmaps to identify areas of highest activity, recorded user sessions for detailed behavioral analysis, and created funnels to analyze user paths through the system.
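Conceptually, a funnel reduces an event log to the number of users who complete each step in order. The tool-agnostic sketch below illustrates the idea; the event names are invented for illustration and are not the actual tracked events.

```python
def funnel(events, steps):
    """Count how many users reached each step of a funnel, in order.

    `events` is a chronological list of (user_id, event_name) tuples;
    `steps` is the ordered list of event names that make up the funnel.
    A user advances only when the next expected step occurs.
    """
    progress = {}  # user_id -> index of the next step the user still needs
    for user, event in events:
        i = progress.get(user, 0)
        if i < len(steps) and event == steps[i]:
            progress[user] = i + 1
    return [sum(1 for p in progress.values() if p > i) for i in range(len(steps))]

events = [
    ("u1", "open_whiteboard"), ("u1", "call_truck"), ("u1", "release_truck"),
    ("u2", "open_whiteboard"), ("u2", "call_truck"),
    ("u3", "open_whiteboard"),
]
print(funnel(events, ["open_whiteboard", "call_truck", "release_truck"]))  # → [3, 2, 1]
```

Drop-off between adjacent counts is exactly where we looked for hesitation or abandoned flows.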


Results: Users most frequently performed tasks related to calling trucks to the dock and releasing them, as well as managing the queue for the current shift using the queue manager. Although color-coded indicators in the form of flags were used less often, users did not consider this feature unnecessary.

Results

The work on updating the Whiteboard module spanned several months and involved stages of research, prototyping, and final development. During this time, we not only modernized the card design but also expanded the module’s functionality to enhance its capabilities while preserving the existing user experience.

After implementing the new features, we conducted system testing, which included user interviews. We established three control groups for the testing:

  • Group A: New users performing tasks with the old design.

  • Group B: New users performing tasks with the updated design.

  • Group C: Experienced users of the old module given access to the updated design.

 

Results:

  • Group B showed a 16% improvement in efficiency compared to Group A, which used the old design.

  • Group C demonstrated the most significant improvement, with a 23% increase in efficiency, indicating reduced task completion time and fewer errors.
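For transparency, an efficiency gain of this kind is typically computed as the relative reduction in mean task-completion time. The formula below is the standard calculation; the timings shown are hypothetical examples, not the study data, which remain confidential.

```python
def efficiency_gain(baseline_times, updated_times):
    """Relative reduction in mean task-completion time, as a percentage."""
    baseline = sum(baseline_times) / len(baseline_times)
    updated = sum(updated_times) / len(updated_times)
    return round((baseline - updated) / baseline * 100, 1)

# Hypothetical per-task timings (seconds) that would yield the reported figures.
print(efficiency_gain([100, 100], [84, 84]))  # → 16.0  (Group B vs. Group A)
print(efficiency_gain([100, 100], [77, 77]))  # → 23.0  (Group C vs. Group A)
```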

During the interviews, users noted that the newly introduced functionality required additional time to learn, leading to errors and decreased efficiency in the test scenarios.

Overall, the updated design was well-received. However, a few respondents expressed reservations, citing the need for extra time to become familiar with the new features as the primary reason for their mixed feedback.
