Redesigned a cluttered dashboard into a clear decision tool, improving manager efficiency and data trust

July 09, 2021

4 min read

Last Updated: September 4, 2025

Over time, the App Pedidos dashboard had become a maze of information. New metrics were added without structure, and every number fought for attention. What had once been a quick way to read performance became a confusing wall of numbers.

The people using it daily, from managers to marketing and sales leads, no longer had a clear path to insight. They needed simplicity to make better decisions, not more information to sift through.

Overall view of the old dashboard with mixed visual hierarchy

Detailed financial and order metrics screen from the old version

Indicators screen showing raw ungrouped city and sales data before redesign

My first action was to go through that experience myself. I opened the app as a new user and was immediately overwhelmed by text that was all the same size, color, and visual weight.

Everything seemed important, so nothing actually was. The next step was to dissect the structure and understand what lived where. I analyzed the user flow of the dashboard, from the main entry point to every subpage it unlocked, until I had a complete map.

A map clarifies a lot. The most important takeaway was understanding how the engineers before me had classified information, how many screens separated each set of content, and where interaction points connected. But one question remained: which information mattered most at each moment?

Flow mapping visualization — breakdown of all screens and connections between dashboard sections

The Turning Point

I realized I couldn’t fix this with layout tweaks. I needed to understand what information actually mattered. So, I asked the users.

The core users were product managers, the head of marketing, and the head of sales. They were using the app daily to track performance and plan next steps. I gathered all the data points from the dashboard and turned them into post-its in FigJam. Each one looked identical to avoid bias. Then I asked everyone to group them by importance: most important, important, and less important.

Group sorting exercise on FigJam — collaborative research session mapping and prioritizing information

Sales and orders quickly appeared in the top group, while online orders through the website were often ranked lower. Still, I wanted to test whether changing the activity structure would reveal different results. So I ran a second round.

Listening Before Drawing

In the second session, instead of grouping, I asked each participant to rank every item individually. This simple change made patterns much clearer: every item now held a numeric position in each participant's list, which could later be combined into a single, objective ranking.

Priority ranking visualization — visual summary of the ranking test highlighting top priorities

The ranking exercise made priorities visible, turning assumptions into measurable data.

This made a great difference. At first, the board looked like just a wall of yellow post-its, but color-coding each participant's choices revealed what was actually happening.

Color-coded ranking visualization — grouped post-its showing the priority ranking by each participant

Now the picture was getting clearer. I assigned a point value to each position in the rank, starting from the left: 1 point for first place, 2 for second, and so on. Summing those position points for each item and dividing by the number of participants gave an average score, producing a single consolidated ranking.
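To make the tally concrete, here is a minimal sketch of that scoring step. The three participant rankings and the item names below are invented for illustration; only the point-summing and averaging logic mirrors the exercise.

```typescript
// Each ranking lists items from most important (index 0) to least important.
type Ranking = string[];

// Hypothetical rankings from three participants, not the real research data.
const rankings: Ranking[] = [
  ["sales", "orders", "online orders", "refunds"],
  ["orders", "sales", "refunds", "online orders"],
  ["sales", "orders", "refunds", "online orders"],
];

// Sum each item's position points: 1 for first place, 2 for second, and so on.
const totals = new Map<string, number>();
for (const ranking of rankings) {
  ranking.forEach((item, index) => {
    totals.set(item, (totals.get(item) ?? 0) + index + 1);
  });
}

// Average across participants; a lower score means a higher priority.
const consolidated = [...totals.entries()]
  .map(([item, total]) => ({ item, average: total / rankings.length }))
  .sort((a, b) => a.average - b.average);

console.log(consolidated);
// e.g. sales (1.33) and orders (1.67) rank above refunds and online orders.
```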

Final resulting rank visualization — consolidated priority chart displaying the average ranking and resulting data hierarchy

Lastly, I cross-checked the overall ranking against the groupings from the first round. This final step surfaced a few inconsistencies and confirmed the rest, closing the loop on the research before moving to design.

Grouped results from Techlead, CEO, CVO, and CFO highlighting how priorities shifted across leadership roles

At that moment, I realized the research had served its purpose. The inconsistencies that once seemed random now had clear reasons behind them. The ranking exercise explained why some data was ignored while other data demanded attention. Everything connected into a clear story of what users prioritized and how they valued the information.

Gathering Inspiration

With the hierarchy clear, I began studying other dashboards. I collected several competitor dashboards to use as reference points. Studying how other products displayed similar data helped me identify what worked well and what felt confusing. These examples guided layout decisions and visual balance before any sketches began.

Competitor's aiqfome dashboard with high data density and card grouping

Competitor's aiqfome simplified dashboard showing clear metric hierarchy and visual contrast

Building Something That Fits

With those insights, I began sketching. Guided by the hierarchy and the competitor references, I experimented with card-based structures that surfaced the most relevant indicators first.

Early wireframe exploring basic card structure

Refined version showing metric hierarchy and value contrast

Test layout experimenting with vertical grouping

Alternate visual density layout for multi-card comparison

It was clean, but during review I found that TitaniumSDK, the framework behind the app, didn't support multiple columns. I adapted the layout into a single-column design where each card title became clickable, as sketched below. This small change saved space and made the screen easier to navigate while staying within technical limits.
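As a rough illustration of that constraint, here is what the single-column card pattern with a clickable title could look like using Titanium's JavaScript UI API. The card titles, values, and navigation stubs are placeholders, not the production code.

```typescript
// Sketch of the single-column card layout, assuming the Titanium runtime.
declare const Ti: any; // provided globally by TitaniumSDK

const win = Ti.UI.createWindow({ backgroundColor: "#fff" });

// One vertical column: each card stacks below the previous one.
const column = Ti.UI.createView({ layout: "vertical", top: 0, width: Ti.UI.FILL });

function addCard(title: string, value: string, onOpen: () => void): void {
  const card = Ti.UI.createView({
    layout: "vertical", top: 8, left: 16, right: 16, height: Ti.UI.SIZE,
  });

  // The card title doubles as the tap target that opens the detail screen.
  const titleLabel = Ti.UI.createLabel({ text: title, left: 0, font: { fontWeight: "bold" } });
  titleLabel.addEventListener("click", onOpen);

  const valueLabel = Ti.UI.createLabel({ text: value, left: 0 });

  card.add(titleLabel);
  card.add(valueLabel);
  column.add(card);
}

// Placeholder metrics for illustration only.
addCard("Sales", "R$ 128,400", () => { /* open the sales detail screen */ });
addCard("Orders", "3,212", () => { /* open the orders detail screen */ });

win.add(column);
win.open();
```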

City indicator list with full metrics

Comparison layout with adjusted typography

Final color-calibrated version showing data hierarchy refinement

We went through several iterations before landing on the final version. It met the technical constraints, respected the visual hierarchy, and made information easier to scan. Small adjustments like clickable titles ended up having a big impact, and the screen finally felt dense but readable.

A Clearer Way to See the Business

Statement Screens

Statement view 1 — financial performance overview with gross and net commission

Statement view 2 — side-by-side comparison of revenue and orders with percentage trends

Indicators: All Cities Screens

Indicators view 1 — ranking of cities by revenue, showing growth and performance metrics

Indicators view 2 — detailed comparison between current and previous month performance by city

Indicators: All Stores Screens

Indicators view 1 — store-level performance breakdown showing revenue and commission

Indicators view 2 — comparison of store data with previous month

When the new dashboard went live, everything started to move faster. Managers could identify key metrics within seconds. The interface that used to feel heavy now felt intuitive. Meetings that dragged on turned into quick, focused discussions.

The clarity achieved here also shaped how other internal tools were designed, aligning teams across the company.

What They Said After Launch

After launch, managers shared how much the redesign changed their workflow. One said it was the first time they could open the app and instantly understand the company’s status without calling the data team.

Another mentioned that meetings became shorter because everyone was finally looking at the same numbers. The head of marketing said it was the first time performance and sales data felt connected.

The feedback confirmed that the redesign didn’t just look better, it helped teams work better.

Results in Numbers

  • 30% faster decision-making during management meetings, with average meeting time dropping from 40 to 28 minutes (based on an internal survey)

  • 3x quicker identification of sales trends compared to the previous version (from 1 to 3 trends spotted per week)

  • 1.5x increase in dashboard usage frequency among the marketing and sales teams (from 4 to 6 weekly active users)

  • 0 reports of confusion or navigation issues during 28 days of post-launch feedback

Let’s build something together.