WorkMark - Making Invisible Work Visible

A performance intelligence platform that transforms employee contributions into clear, evidence-based recognition.

Region

Hyderabad, Telangana

Year

2026

Product Type

AI-powered B2B SaaS Platform

Industry

HR Tech × Enterprise Productivity × AI

The Project Itself:

Project Overview

In modern organizations, employees work across multiple tools, teams, and timelines, but much of their real effort remains invisible. Performance is often judged by memory, visibility, and self-reporting rather than evidence.
WorkMark reimagines performance evaluation by transforming everyday work into structured evidence and a single, transparent performance story for both employees and managers.

Problem:

As organizations grow, managers struggle to see the full picture of employee contributions, while employees feel their efforts are overlooked. Important work like collaboration, support, and continuous effort gets lost across tools and conversations.
This creates stress for employees, uncertainty for managers, and performance decisions that feel subjective, inconsistent, and unfair.

Goal:

The goal of WorkMark is to make invisible work visible and performance fair.
By capturing real work signals and converting them into clear insights and a single performance health score, WorkMark empowers employees to feel recognized and helps managers make confident, evidence-based decisions.

My role:

Product Designer (End-to-End), supporting GenAI workflow integration.

Responsibilities:
  • Defined the problem and scope

  • Designed the AI trust flow

  • Conducted secondary research

  • Defined the system flow and user flow

  • Defined human decision states

  • Documented design rationale

Strategic Diagnosis:

Problem Framing

Modern organizations rely on structured performance reviews to drive compensation, promotions, and retention decisions. However, everyday work contributions remain fragmented across collaboration tools, leaving decision-makers dependent on narrative memory rather than structured evidence.

Impacted Business KPIs:

Instead of immediately redesigning the interface, I adopted a hypothesis-driven approach to validate the root causes behind governance and workflow breakdowns.

This ensured that research efforts were focused, testable, and aligned with measurable business outcomes.

Hypothesis:

If contribution signals from distributed tools are passively aggregated and structured into contextual summaries, then performance evaluation efficiency and calibration fairness will improve without increasing governance risk.

If:

Structured evidence availability increases

Then:

  • Review preparation time decreases

  • Calibration override frequency decreases

  • Perceived evaluation fairness increases
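To make the hypothesis concrete, here is a minimal sketch of passively aggregating contribution signals from distributed tools into a per-employee contextual summary. All names, signal types, and weights are illustrative assumptions for this case study, not WorkMark's actual implementation.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical shape of a passively captured contribution signal.
@dataclass
class Signal:
    employee: str
    source: str      # e.g. "git", "chat", "tickets"
    kind: str        # e.g. "commit", "review", "support_reply"
    weight: float    # illustrative effort weight for this signal type

def summarize(signals):
    """Group raw signals into a per-employee contextual summary:
    counts per source plus a simple effort total."""
    summary = defaultdict(lambda: {"by_source": defaultdict(int), "effort": 0.0})
    for s in signals:
        entry = summary[s.employee]
        entry["by_source"][s.source] += 1
        entry["effort"] += s.weight
    return dict(summary)

signals = [
    Signal("asha", "git", "commit", 1.0),
    Signal("asha", "chat", "support_reply", 0.5),
    Signal("ravi", "tickets", "review", 0.8),
]
print(summarize(signals)["asha"]["effort"])  # 1.5
```

The point of the sketch is the "passive" part: no self-reporting step exists; the summary is derived entirely from signals the tools already emit.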

Before initiating deep research, I aligned with stakeholders on what long-term success would mean for the platform.
This ensured that hypothesis validation and solution exploration remained aligned with measurable product outcomes.

Background insight:

WorkMark

Making Invisible Work Visible in Enterprise Performance Systems. A performance intelligence layer that transforms fragmented activity signals into structured, decision-ready evidence for fair and scalable performance evaluation.

Historical Context

Traditionally, employee performance was judged through periodic reviews and manager memory, which worked when work was visible and teams were small. As work moved across digital tools and distributed teams, real contributions became increasingly invisible, and calibration grew subjective.

Work happens every day.

Recognition happens occasionally.

The Hidden Cost

Consequences

When work remains invisible, organizations lose productivity, trust, and talent, while employees lose motivation and recognition.

Source: Brandon Hall Group, Gallup engagement research, Gallup State of the Global Workplace 2024/2025 and Gallup State of the Global Workplace 2023 Report

Focus Boundaries

Scope

This project focuses on designing a minimal performance intelligence system that captures everyday work and translates it into a single, transparent performance health score.
It centers on how employees can experience fair, evidence-based performance visibility without complex systems; the manager and organization experiences are reserved for future scope.

Who We Designed For:

Target Audience

Companies with 50–500 employees operating in the "chaos zone": too large for informal recognition, too small for enterprise HR systems.

The Sweet Spot of Complexity

Why Mid-Scale?

The recognition challenge varies dramatically by company size. Mid-scale organizations face a unique inflection point: the highest performance ambiguity, because they are too large for intuition and too small for advanced systems.

What’s Actually Happening

Recognition Reality

Is this really happening in organizations? The data tells a stark story.

Root Causes

Why This Problem Exists

The recognition gap isn't about individual failure; it's a systemic breakdown across multiple dimensions. Understanding these root causes is essential to building an effective solution.

Source: Zymplify Business Growth Research 2025, Harvard Business Review research, McKinsey productivity research via multiple citations

Decision Pressure

Manager Cognitive Overload

The expectation that managers can track, remember, and fairly evaluate all contributions across their team is fundamentally unrealistic. They're juggling their own IC work while trying to support their reports.

The percentages above indicate the relative cognitive and workload intensity faced by managers, not literal statistical values.

Source : Zymplify Business Growth Research 2025, Harvard Business Review research, McKinsey productivity research via multiple citations

Expectation vs Reality

Recognition Gap

There is a massive mismatch between what employees need and what organizations provide. This gap directly impacts engagement, retention, and productivity.

Source: Nectar Employee Recognition Survey; High5Test aggregation (cited above)

Research Approach

How We Investigated the Problem

We followed a mixed-method approach combining:

  • Secondary research (industry insights & existing systems)

  • Quantitative research (surveys)

  • Qualitative research (user interviews)

This helped us validate the problem from both data and human perspectives.

Existing Ecosystem

Do Tools Exist?

Yes, organizations already use multiple tools to manage work, performance, and recognition.
But these tools operate in isolation and fail to capture the full picture of employee contributions.

Why Existing Tools Fail

The problem persists because current solutions:

  • Operate in silos, without integration

  • Depend on manual reporting and human memory

  • Don't reduce bias; they may even amplify it

  • Don't integrate evidence from where work happens

  • Don't provide continuous, real-time insights

Insight Gap

The Missing Layer

Between “where work happens” and “where performance is evaluated,” there is no system that converts everyday work into structured, unbiased evidence.

We call this gap the Performance Intelligence Layer.

Product Matrix

Measuring the Problem

Between “where work happens” and “where performance is evaluated,” there is no system that converts everyday work into structured, unbiased evidence.

We call this gap the Performance Intelligence Layer.

How WorkMark is Different?

Existing tools answer:
“What was done?”

WorkMark answers:
“What actually mattered?” It creates evidence, not just data.

Google Survey:

All about the user

User Research

Tools exist → But they fail → So we studied real users → To validate the problem → With data and stories.

Even though many tools exist, we wanted to understand:

  • Is the problem real?

  • Who feels it the most?

  • How severe is it?

  • Where exactly does it occur?

So we conducted qualitative and quantitative research with mid-scale organizations in India.


Quantitative Research

Data Patterns

We conducted surveys and data analysis to quantify how performance visibility, recognition, and evaluation actually function in mid-scale organizations. The results revealed significant gaps between real work and recorded performance data.

Responses received:

Survey analysis revealed the specific types of employee contributions that consistently go unnoticed, highlighting patterns of invisible work across roles.

Qualitative Research

User interviews

We conducted in-depth interviews to understand real employee experiences, emotions, and hidden challenges behind performance visibility. We ensured data quality by collecting responses from diverse roles, organization sizes, and tool ecosystems relevant to mid-scale companies.

Who we spoke to → Why → What we asked → What we heard → What it means → What we derived.

Who did we talk to?

  • Total participants: 15 users

  • Roles: Employees, Managers, HR

  • Org type: Mid-scale product/SaaS companies

  • Experience range: 1–18 years

We conducted semi-structured interviews to explore how performance is tracked, recognized, and evaluated in daily work. Questions focused on visibility, recognition, stress, and fairness.

Interview response sheet:

Few Screenshots of User Interviews

I introduced the XSOCK framework to systematically translate scattered qualitative interview insights into structured patterns, root causes, and actionable design opportunities, ensuring that design decisions were grounded in real user behavior rather than assumptions.

Regional Insight:

Behavioral Archetypes


Design Direction:

Final Concept Definition

WorkMark is a Performance Intelligence Layer that captures everyday work from tools and manual inputs, converts it into structured evidence, and translates it into a transparent performance health score.

How might we?

Turning problem statements into opportunity directions.

Based on our research insights, we framed the following “How Might We” questions to translate user pain points into actionable design opportunities.

1

How might we make invisible work visible without burdening employees?

2

How might we help managers evaluate performance using evidence instead of memory?

3

How might we capture everyday contributions automatically?

4

How might we make performance visibility feel motivating, not stressful?

5

How might we connect work execution tools with performance evaluation?

6

How might we design AI as a supportive coach, not a judge?

Insight → Design Mapping

How research findings directly informed design decisions


The Solution

Turning problem statements into opportunity directions.

WorkMark is a Performance Intelligence Platform that captures everyday employee work, converts it into structured evidence, and translates it into a transparent performance health score, enabling fair, continuous, and data-driven performance evaluation for employees and managers.

1

Work Capture Engine

Logs work from manual inputs and integrated tools.

2

Evidence Intelligence Layer

Converts activities into structured performance evidence.

3

Performance Health Score

Creates a single, explainable performance score.

4

Continuous Feedback & Visibility

Enables ongoing feedback and AI insights.

From scattered work to structured evidence, WorkMark makes performance visible and measurable.
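The Performance Health Score component above can be sketched as a weighted, explainable aggregation: each evidence category contributes a normalized sub-score, and the per-category breakdown is returned alongside the total so the number remains explainable. The category names and weights below are hypothetical, chosen only to illustrate the idea; they are not WorkMark's actual scoring model.

```python
# Illustrative category weights (assumptions, not the real model).
WEIGHTS = {"delivery": 0.4, "collaboration": 0.3, "support": 0.2, "consistency": 0.1}

def health_score(evidence):
    """evidence: {category: value in [0, 1]} derived from structured signals.
    Returns (score out of 100, per-category breakdown) so every point
    in the final score can be traced back to an evidence category."""
    breakdown = {}
    total = 0.0
    for category, weight in WEIGHTS.items():
        value = min(max(evidence.get(category, 0.0), 0.0), 1.0)  # clamp to [0, 1]
        contribution = weight * value * 100
        breakdown[category] = round(contribution, 1)
        total += contribution
    return round(total, 1), breakdown

score, breakdown = health_score(
    {"delivery": 0.8, "collaboration": 0.9, "support": 0.6, "consistency": 1.0}
)
print(score)      # 81.0
print(breakdown)  # {'delivery': 32.0, 'collaboration': 27.0, 'support': 12.0, 'consistency': 10.0}
```

Returning the breakdown with the score is the explainability mechanism: an employee or manager can see exactly which evidence category moved the number, rather than trusting an opaque total.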

System Flow

How it works

I defined the system flow to show how data from tools, AI processing, and performance logic work together behind the scenes to generate meaningful insights and the performance score.

User flow

User journey

I designed the user flow to map how employees and managers move through WorkMark step by step, ensuring every action feels logical, simple, and aligned with their real-world workflow.

Information Architecture

Structure

I mapped the information architecture to organize WorkMark's screens and content hierarchy, so that evidence, insights, and the performance score sit where employees and managers expect to find them.

Design Approach

Starting the design

The visual design focuses on reducing cognitive load and conveying trust in AI decisions through calm layouts, clear hierarchy, and semantic visual cues.

What Changed:

Outcome

WorkMark reframed performance from subjective memory-based evaluation to evidence-driven visibility, making everyday contributions measurable and fair.


The project demonstrated how AI and design can transform invisible work into meaningful, explainable performance insights for employees and managers.

Takeaways

A product identity built for clarity, trust, and accessibility.

Designing WorkMark revealed that the real challenge in performance systems is not measurement but visibility, trust, and fairness.

What I Learned

1

Performance is a perception problem before it is a data problem.

2

Invisible work carries significant organizational value.

3

AI must be explainable to build trust in enterprise systems.

4

Mid-scale teams struggle most with fragmented performance data.

5

Design systems can shape organizational culture, not just interfaces.

Next Steps

1

Validate the system with real mid-scale teams and managers.

2

Refine the performance scoring model with real-world data.

3

Expand AI insights into predictive career and growth signals.

4

Prototype a manager-facing version for team-level decision-making.

This project demonstrates how invisible work can be transformed into transparent, evidence-driven performance using design and AI.

Let's Create Something
Meaningful Together

Whether you're shaping a product, solving a system problem, or exploring an idea, I'd love to collaborate.


Follow me on Other Channels:
