Improving code assessment experience for .NET modernization
Intelligent .NET repository assessment and transformation planning for modernizing legacy applications to cross-platform .NET
Summary
I led the UX design for an AI-powered .NET code assessment capability with complexity, compatibility, and dependency analysis, helping customers gain deep insight into their repositories and build confidence in their modernization planning.
Impact
By enabling developers to assess 100+ repositories in parallel and providing actionable insights, the code assessment capability has increased developer satisfaction and reduced modernization planning time from weeks to days.
Problem
Enterprise customers want comprehensive code assessments so they can make confident transformation plans, but the existing experience provided far too little information to support them.
Before: The assessment process was hidden in a worklog, leaving many customers confused about what was happening
Before: Limited visibility into assessment results, and the underlying data was hard to access
Transparency
The current experience generates a transformation plan without showing the "why" behind it
Discoverability
Customers struggle to extract actionable insights from the rich technical data about their repositories, which is difficult to discover and navigate
Uncertainty
Current transformation plans are recommended based only on limited metadata, such as last-modified date and basic repository info, leaving customers uncertain about actual complexity and dependencies
Manual
Current repository assessment requires weeks of manual work with downloaded spreadsheets, which is error-prone and labor-intensive
Challenge
I needed to resolve several UX challenges at each step of the assessment process:
Code Assessment User Flow
Discover Repo
Select Repos
Start Assessment
View Results
Process
Based on these findings, I designed the assessment workflow around core design principles:
- Transparency: Show assessment progress and status in real-time
- Accessibility: Make assessment data queryable through natural language and filterable through UI
Report structure
Teams need to assess repository dependencies at multiple levels: repository, project, and package.
In-progress results
I worked with engineering to propose an in-progress results pattern for long-running tasks.
Flow iterations
I ran user testing sessions with enterprise customers to validate design decisions and gather feedback, then refined the flow based on those insights to optimize the experience for clarity, efficiency, and accessibility.
Final Result
I designed a multi-phase assessment and planning workflow that gives users control, transparency, and actionable insights:
1. Bulk Selection Capabilities
Bulk selection lets teams assess 100+ repositories in parallel without selecting them one by one, eliminating UI friction and enabling teams to assess their entire portfolio efficiently.
2. In-Progress Assessment Viewing
Real-time status updates reduce anxiety and give users control over the assessment process. Users can monitor progress as repositories are being analyzed.
3. Multi-Level Reporting & Chat Query
Assessment results are presented at the job and repository levels, with detailed project information available in downloadable CSV files. Users can also ask questions in chat, such as "Which repos have incompatible packages?", allowing both technical teams and business stakeholders to understand assessment findings.
4. Visual Dependency Groups
Visual dependency maps and intelligent grouping help teams understand relationships between repositories and dependencies, enabling better transformation planning and reducing complexity.
Outcome
Since launch in August 2025, customer feedback on the assessment experience has improved. Customers such as Thomson Reuters have praised the comprehensiveness of the assessment, which gives them the actionable insights they need to plan their modernization properly.
Thanks for reading!