2025/02/18
Upgrade of the customer management system iPlat
Project Overview
We conducted a comprehensive verification of the existing business system (hereafter “the System”) to ensure that functionality, display quality, and usability are maintained when migrating from Windows 10 to Windows 11. For every screen and function, we collected evidence on both Chrome and Microsoft Edge and compared the results with those on Windows 10. For any discrepancies identified, we isolated the root cause, proposed corrective options, and performed re-verification.
Technology Stack & Development Tools
- Programming Languages: Java, JavaScript
- Frameworks: Struts, Spring, iBATIS
- Database: Oracle
- Task Management Tool: Redmine
- Communication Tool: Slack
Client Challenges
Risks from End of Support
The System has been built and operated on Windows 10. As the OS approaches end of support, the availability of security patches and optimizations becomes uncertain, raising concerns about potential vulnerabilities and performance degradation. A timely basis for migration decisions was required.
Compatibility Risks During Upgrade
It was unclear whether existing functions would behave consistently after migrating to Windows 11. Along the user operation flow—navigation, display, data entry, processing, and output—unexpected defects or behavioral differences could occur.
Need for Comprehensive Quality Assurance
Rather than ad-hoc checks, a full review of all screens and functions was required. Results between Windows 10 and Windows 11 had to be compared to make differences explicit, with evidence captured for each item to support subsequent analysis and explanation.
Client Requirements
- Verify all functions on Windows 11, confirming each on both Chrome and Edge.
- Compare results between Windows 11 and Windows 10, recording evidence per screen and function.
- Investigate any defects found during verification and propose corrective measures.
- Report immediately if issues arise during the verification process.
Our Proposal & Approach
Define Test Perspectives & Standardize Templates
Starting from the System’s screen list, we inventoried operations (navigation/display/input items) and screen elements (tabs, parent–child relations, components, buttons), then set perspectives such as “navigable,” “displayable,” “retrievable,” “enterable/selectable,” and “clickable.” These were consolidated into a reusable common template to minimize omissions and duplication.
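As a sketch of the template idea (the screen names and data shape here are hypothetical, not the actual template), the checklist can be generated mechanically by crossing the screen inventory with the perspectives, which is what keeps omissions and duplicates out:

```javascript
// Hypothetical screen inventory and the perspectives described above.
const screens = ["Customer Search", "Customer Detail", "Contract Entry"];
const perspectives = [
  "navigable",
  "displayable",
  "retrievable",
  "enterable/selectable",
  "clickable",
];

// Cross screens with perspectives to produce one checklist row per item,
// with a result slot for each OS to be filled in during verification.
function buildChecklist(screens, perspectives) {
  const rows = [];
  for (const screen of screens) {
    for (const perspective of perspectives) {
      rows.push({ screen, perspective, win10: null, win11: null });
    }
  }
  return rows;
}

const checklist = buildChecklist(screens, perspectives);
```

Because every screen gets exactly one row per perspective, the row count is always screens × perspectives, which makes gaps in coverage immediately visible.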
Capture Evidence Across Two Environments & Cross-check Results
We executed the same perspectives on both Windows 10 and Windows 11, collecting evidence per screen/function. Results were also recorded on Chrome and Edge to isolate differences by the OS × browser combination. When discrepancies were found, we clarified reproduction steps and occurrence conditions.
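The cross-check step can be sketched as follows (the data values are illustrative, not actual results): enumerate the OS × browser combinations, then flag any Windows 11 result that differs from the Windows 10 baseline for the same browser, so each discrepancy is attributed to a specific environment.

```javascript
const oses = ["Windows 10", "Windows 11"];
const browsers = ["Chrome", "Edge"];

// Every environment in which evidence is captured (2 OSes x 2 browsers).
const environments = oses.flatMap(os =>
  browsers.map(browser => ({ os, browser }))
);

// results: [{ os, browser, value }] for one screen/function item.
// Compares the Windows 11 result against the Windows 10 baseline per browser.
function findDiscrepancies(results) {
  const diffs = [];
  for (const browser of browsers) {
    const base = results.find(r => r.os === "Windows 10" && r.browser === browser);
    const target = results.find(r => r.os === "Windows 11" && r.browser === browser);
    if (base && target && base.value !== target.value) {
      diffs.push({ browser, expected: base.value, actual: target.value });
    }
  }
  return diffs;
}
```

Keeping the baseline comparison per browser is what separates OS-caused differences from browser-caused ones: a mismatch confined to one browser points at the browser, while the same mismatch on both points at the OS.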
Focus on User Operations & Display Quality
We validated correctness against expected behavior for user operations (navigation, display, input, processing, output). In parallel, we checked that display aspects on Windows 11—typography, component rendering, and screen layout—were equivalent to Windows 10 to prevent UX degradation.
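A minimal sketch of the display-equivalence check (the component names and properties are hypothetical): each environment's display evidence is reduced to a per-component snapshot of style and layout metrics, and the Windows 11 snapshot is diffed against the Windows 10 baseline property by property.

```javascript
// baseline / candidate: { componentName: { property: value, ... }, ... }
// Returns one diff record per property that does not match the baseline.
function diffSnapshots(baseline, candidate) {
  const diffs = [];
  for (const [component, props] of Object.entries(baseline)) {
    for (const [prop, expected] of Object.entries(props)) {
      const actual = candidate[component] && candidate[component][prop];
      if (actual !== expected) {
        diffs.push({ component, prop, expected, actual });
      }
    }
  }
  return diffs;
}
```

An empty diff list means the Windows 11 rendering matches the baseline for every recorded property; any non-empty result names the exact component and property that regressed, which feeds directly into root-cause isolation.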
Cycle of Early Sharing, Root-cause Analysis, and Re-verification
Upon detecting differences or defects, we reported immediately, isolated causes, and proposed fixes. After applying changes, we re-verified using the same perspectives to confirm resolution, iterating in small batches to control impact and ensure stability.