
Umaku Dashboard

The Umaku Dashboard provides a unified, AI-driven view of project health, progress, and delivery quality.

It combines traditional project metrics with AI agent feedback, giving teams real-time visibility into execution, quality, and risk.

This dashboard helps project managers, engineers, and stakeholders quickly understand:

  • How the project is progressing
  • The quality of delivered work
  • Sprint performance and alignment
  • Open risks, bugs, and descoping requests

The Umaku Dashboard is divided into multiple sections, each focusing on a specific aspect of project execution.

[Screenshot: Umaku Dashboard]

The Project Progress bar shows the overall completion percentage of the project.

  • Calculated based on completed work items, sprint progress, and milestones
  • Gives a quick, high-level indicator of project health
  • Helps stakeholders understand how close the project is to delivery
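Umaku does not document the exact formula, but the calculation can be pictured as a weighted blend of the three inputs listed above. The following is a hypothetical sketch; the weights, function name, and parameters are illustrative assumptions, not Umaku's actual implementation:

```python
# Hypothetical sketch of a project-progress calculation.
# The weights and the weighted-average approach are assumptions;
# Umaku's real formula is not documented here.

def project_progress(completed_items: int, total_items: int,
                     sprint_progress: float, milestone_progress: float,
                     weights=(0.5, 0.3, 0.2)) -> float:
    """Return an overall completion percentage (0-100)."""
    item_progress = 100.0 * completed_items / total_items if total_items else 0.0
    w_items, w_sprint, w_milestones = weights
    return round(w_items * item_progress
                 + w_sprint * sprint_progress
                 + w_milestones * milestone_progress, 1)

# Example: 40 of 80 items done, current sprint 60% complete,
# 2 of 4 milestones reached (50%).
print(project_progress(40, 80, 60.0, 50.0))  # → 53.0
```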

The AI Scores Trend visualizes performance insights generated by AI agents across completed sprints.

The dashboard tracks four core AI evaluation dimensions:

  1. Sprint Inclusion
    • Measures how well committed sprint items are actually delivered
    • Highlights scope stability and planning accuracy
  2. Code Quality
    • Evaluates maintainability, readability, and correctness of submitted code
    • Based on AI code analysis and review feedback
  3. DevOps Compliance
    • Assesses adherence to CI/CD practices, deployment standards, and operational readiness
    • Identifies gaps in automation or process compliance
  4. Bugs Finder
    • Tracks potential defects detected by AI
    • Includes logical issues, edge cases, and regression risks

Reading the chart:

  • Each line represents a completed sprint
  • Scores are normalized as percentages
  • Trends help teams identify improvement or degradation over time
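The document does not specify how the normalization works. As a minimal sketch, assuming raw AI scores on a 0-10 scale (an assumption, not Umaku's documented scale), converting them to the percentages shown in the chart could look like this:

```python
# Hypothetical normalization of per-sprint AI scores to percentages.
# The 0-10 raw scale is an assumption; dimension names come from
# the section above.

RAW_SCALE_MAX = 10.0  # assumed raw score range

def normalize_scores(raw_scores: dict) -> dict:
    """Convert raw AI scores to 0-100 percentages."""
    return {dim: round(100.0 * s / RAW_SCALE_MAX, 1)
            for dim, s in raw_scores.items()}

sprint_3 = {"Sprint Inclusion": 8.5, "Code Quality": 7.2,
            "DevOps Compliance": 9.0, "Bugs Finder": 6.8}
print(normalize_scores(sprint_3))
# {'Sprint Inclusion': 85.0, 'Code Quality': 72.0, ...}
```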

The Status Overview provides a snapshot of all work items by status.

Typical statuses include:

  • To Do
  • Doing
  • Done
  • Review
  • AI Agent (items currently being analyzed by AI)

This section helps teams quickly assess workload distribution and delivery progress.
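Conceptually, the Status Overview is a simple count of work items grouped by status. A minimal sketch with illustrative item data (the field names are assumptions):

```python
# Counting work items per status, as the Status Overview chart does.
# The item records here are illustrative, not Umaku's data model.

from collections import Counter

work_items = [
    {"id": 1, "status": "Done"},
    {"id": 2, "status": "Doing"},
    {"id": 3, "status": "To Do"},
    {"id": 4, "status": "AI Agent"},
    {"id": 5, "status": "Done"},
]

status_counts = Counter(item["status"] for item in work_items)
print(status_counts)
# Counter({'Done': 2, 'Doing': 1, 'To Do': 1, 'AI Agent': 1})
```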

The Descoping Requests section shows tasks flagged by AI for potential issues.

AI identifies tasks that may have:

  • Code quality problems
  • Logical errors
  • Other issues impacting delivery

Users can review each request and:

  • Accept the descope – the task is moved back to the backlog
  • Reject the descope – the task stays in the sprint and is removed from the descoping list

This ensures that tasks with potential problems are either reconsidered or removed, helping maintain sprint and project quality.
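The decision flow above can be sketched in code. This is a hedged illustration, assuming that accepting a descope returns the task to the backlog while rejecting it keeps the task in the sprint and only clears the flag; the function and field names are hypothetical:

```python
# Hypothetical sketch of the accept/reject descoping flow.
# Field names and statuses are assumptions, not Umaku's data model.

def resolve_descope_request(task: dict, accept: bool) -> dict:
    """Apply a descoping decision to an AI-flagged task."""
    if accept:
        # Accepted: the task leaves the sprint and returns to the backlog.
        task["location"] = "backlog"
    else:
        # Rejected: the task stays in the sprint; only the flag is cleared.
        task["location"] = "sprint"
    task["descope_flagged"] = False
    return task

flagged = {"id": 42, "location": "sprint", "descope_flagged": True}
print(resolve_descope_request(flagged, accept=True)["location"])  # → backlog
```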

The Current Sprint Goal and Objectives section summarizes the active sprint and its intent.

It displays:

  • Sprint title and duration
  • Overall sprint completion percentage
  • Sprint goal
  • Detailed sprint objectives

This section:

  • Aligns the team around current priorities
  • Provides context for AI evaluations
  • Helps reviewers understand whether work aligns with sprint goals

The Risks section highlights known risks that may impact delivery.

  • Includes AI-identified and manually added risks
  • May contain mitigation notes or dependencies
  • Helps teams proactively address issues before they escalate

The Umaku Dashboard combines execution metrics and AI intelligence into a single view, enabling teams to:

  • Detect issues earlier
  • Improve code and delivery quality
  • Maintain sprint alignment
  • Reduce scope creep
  • Make data-driven decisions faster

Use these tutorials to understand and work with the Umaku Dashboard. Each guide walks you through a specific part of the dashboard and explains how to interpret the data shown.

Learn how to quickly understand the overall progress of a project.

    1. Navigate to Projects.
    2. Select the project you want to view.
    3. The project opens on the Overview tab, showing the Umaku Dashboard.

[Screen recording: opening the Umaku Dashboard]

    1. At the top of the dashboard, find the Project Progress bar.

    [Screenshot: Project Progress bar]

    2. Review the percentage shown on the right side.

Learn how to read AI-generated performance insights across sprints.

    1. Scroll down to the AI Scores Trend section on the dashboard.
    2. The chart displays AI evaluation scores across completed sprints.

    [Screenshot: AI Scores Trend]

    3. The chart includes four evaluation dimensions:

    • Sprint Inclusion – Measures sprint commitment vs. delivery
    • Code Quality – Evaluates code maintainability and correctness
    • DevOps Compliance – Assesses CI/CD and operational standards
    • Bugs Finder – Detects potential defects and regressions

Learn how to view the distribution of work items by status.

    1. Scroll to the Status Overview section.
    2. Review the chart showing all work items grouped by status.

    [Screenshot: Status Overview]

    3. Interpret the statuses:

    • To Do – Items not yet started
    • Doing – Items currently in progress
    • Done – Completed items
    • AI Agent – Items under AI analysis

Learn how to review tasks flagged by AI for potential removal or deprioritization.

    1. Scroll to the Descoping Requests section.
    2. Review the list of AI-flagged tasks.
    3. For each request, review the reason provided by the AI.
    4. Choose to:
    • Accept the descope – the task is moved back to the backlog
    • Reject it – the task is removed from the descoping list and stays in the sprint

    [Screenshot: Descoping Requests]

Review Current Sprint Goals and Objectives


Learn how to understand the focus of the active sprint.

    1. Scroll to Current Sprint Goal and Objectives.
    2. Review the sprint title, duration, and completion percentage.
    3. Review the details shown:
    • Sprint goal
    • Sprint objectives
    • Key focus areas (features, improvements, bugs)

Learn how to check known risks that may impact delivery.

    1. Scroll to the Risks section at the bottom of the dashboard.
    2. Review listed risks and mitigation notes.

    [Screenshot: Risks section]