LN ERP - Customer Community
Benchmarking application quality
Legacy Contributor
Hello,
I am wondering if anyone has done metrics tracking on the quality of the Baan or ERP LN application. We as a technical team can easily say the system is up and available or it is not, but how does one measure the quality of the application in a quantitative way?
Looking forward to hearing from anyone who has attempted this.
Thanks!
Comments
joatzin
Hi. We track a variety of things with our ERP LN incidents. Basically, we measure the quality of the solutions/responsiveness to the incidents and other questions we may have.
Specifically, we measure the number of incidents that are waiting for solutions and their age: how "old" each open incident is, and how long it took to get a solution once one arrived.
We define aging measures by the Severity level. We exclude severity 5s because they're enhancements. For Severity 2, 3 and 4s, we have ranges of time (for each Severity) that we measure "success". We use color ratings (gold, silver, bronze, yellow and red). For example, if a solution for a Sev 2 incident is provided to us within two weeks of submitting it in Xtreme, then we consider that a Gold rating. If the solution doesn't come for 3 months, then we consider that Red. Then the ranges are different for Sev 3 and 4. The "rating" time frames are driven from our internal IT schedules to implement integrated solutions. Your company may be perfectly satisfied with getting a Severity 2 solution in a month or two and could consider that an A+ for Quality.
We also keep track of whether or not the solution passes our internal validation testing; if it doesn't, we track the "return" and the fact that we get a "revised" solution.
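The rating scheme above can be sketched as a small lookup: map an incident's severity and time-to-solution onto a color rating, excluding Severity 5 enhancements. The threshold numbers below are illustrative assumptions (only the Sev 2 Gold/Red examples come from the post; your own ranges would be driven by your IT schedules, as noted above), and the function name is hypothetical.

```python
from datetime import date

# Hypothetical rating thresholds (days to solution) per severity level.
# Only the Sev 2 examples (Gold <= 2 weeks, Red at ~3 months) come from
# the post; the rest are placeholder values.
THRESHOLDS = {
    2: [(14, "Gold"), (30, "Silver"), (60, "Bronze"), (89, "Yellow")],
    3: [(30, "Gold"), (60, "Silver"), (90, "Bronze"), (120, "Yellow")],
    4: [(60, "Gold"), (90, "Silver"), (120, "Bronze"), (180, "Yellow")],
}

def rate_incident(severity: int, submitted: date, solved: date) -> str:
    """Map an incident's time-to-solution to a color rating.

    Severity 5s (enhancements) are excluded from quality tracking.
    Anything slower than the last threshold for its severity is Red.
    """
    if severity == 5:
        return "Excluded"
    age_days = (solved - submitted).days
    for limit, rating in THRESHOLDS[severity]:
        if age_days <= limit:
            return rating
    return "Red"

# A Sev 2 solved in 10 days rates Gold; one that takes ~100 days rates Red.
print(rate_incident(2, date(2024, 1, 1), date(2024, 1, 11)))  # Gold
print(rate_incident(2, date(2024, 1, 1), date(2024, 4, 10)))  # Red
```

Running the same function over all closed incidents in a period (and counting still-open ones by age bucket) gives the kind of quantitative quality measure the original question asks about.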