3130 Model Validation:
Applying Audits and Metrics to UML
Richard C. Gronback, Sr. Product Manager, Together Products
Borland Software Corporation
Agenda
Audits and Metrics Defined
Audits in UML
– General Diagramming
– Diagram-specific
Metrics in UML
– General Diagramming
– Diagram-specific
Conclusion
UML Constraints
• The UML Specification is mostly a description of a MOF metamodel, including constraints defined in OCL
– It includes some notation as well
• Ideally, these constraints are manifested in tools to prevent you from violating them
– This is not always the case
UML Constraint Example
• Consider these constraints for an Actor, found in the UseCases package:

-- An actor can only have associations to use cases,
-- components and classes. Furthermore these
-- associations must be binary.
self.ownedAttribute->forAll ( a |
  (a.association->notEmpty()) implies
  ((a.association.memberEnd->size() = 2) and
   (a.opposite.class.oclIsKindOf(UseCase) or
    (a.opposite.class.oclIsKindOf(Class) and not
     a.opposite.class.oclIsKindOf(Behavior)))))

-- An actor must have a name.
name->notEmpty()
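As a concrete (hypothetical) illustration of how such constraints could be checked outside a modeling tool, here is a small Python sketch; the class names and model shape are assumptions, not a real UML metamodel API:

```python
# A minimal sketch of checking the two Actor constraints above.
# The dataclasses below are illustrative stand-ins for UML model elements.
from dataclasses import dataclass, field

@dataclass
class UseCase:
    name: str = ""

@dataclass
class Actor:
    name: str = ""
    # each association is a tuple of its member ends (participating elements)
    associations: list = field(default_factory=list)

def check_actor(actor):
    """Return a list of constraint violations for an Actor."""
    violations = []
    if not actor.name:  # name->notEmpty()
        violations.append("actor must have a name")
    for assoc in actor.associations:
        if len(assoc) != 2:  # associations must be binary
            violations.append("association is not binary")
            continue
        other = assoc[0] if assoc[1] is actor else assoc[1]
        if not isinstance(other, UseCase):  # simplified: allow use cases only
            violations.append("actor may only associate with use cases/classes")
    return violations

withdraw = UseCase(name="Withdraw Cash")
customer = Actor(name="Customer")
customer.associations.append((customer, withdraw))

anonymous = Actor(name="")  # violates name->notEmpty()
anonymous.associations.append((anonymous, withdraw, withdraw))  # not binary
```

A conforming actor yields an empty violation list; the unnamed actor with a ternary association yields two violations.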
Beyond UML Constraints
• Conforming to UML constraints is the first step to a well-formed model
• To improve readability, consistency, usage, generative potential, etc.:
– Attend training
– Perform manual inspections
– Utilize static analysis tools
Audits
What are Audits?
– Checks for conformance to standard or user-defined style, maintenance, and robustness guidelines.
Motivation
– Improved readability
– Improved standards conformance
– Increased generator effectiveness
– Reveal subtle errors and bugs
Audits
Knowing what to look for…
– Reading
• Books
• UML Specifications
• Trade articles
– Mentors
• Senior modelers in your organization
• Resident “language lawyers”
• Outside consultants
– Experience• Takes time to develop depth and breadth
– Tools
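To illustrate what an audit tool automates, here is a hypothetical sketch: each audit is a named predicate applied to every model element, and failures are collected into a report. The dict shape and audit names are illustrative assumptions.

```python
# A toy audit runner: audits are (name, predicate) pairs over model elements.
def always_named(elem):
    """Audit: every element should have a non-empty name."""
    return bool(elem.get("name"))

def no_dangling(elem):
    """Audit: every element should appear on at least one diagram."""
    return bool(elem.get("diagrams"))

AUDITS = [
    ("Always Name Elements", always_named),
    ("Avoid Dangling Model Elements", no_dangling),
]

def run_audits(model):
    """Apply every audit to every element; return (audit, element) failures."""
    return [(name, elem)
            for elem in model
            for name, check in AUDITS
            if not check(elem)]

model = [
    {"name": "Customer", "diagrams": ["use-cases"]},
    {"name": "", "diagrams": []},  # violates both audits
]
failures = run_audits(model)
```

The unnamed, dangling element trips both audits; real tools follow the same pattern over a full metamodel.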
Metrics
Quantifying Model Characteristics
Definition:
– Measurements that allow for the analysis of a model and design are formulated as metrics
– Technically, a metric is a function with two arguments, while a measure “is a numerical value for an attribute assessed for magnitude against an agreed scale” (Henderson-Sellers, 42)
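The metric/measure distinction above can be made concrete with a small sketch: a measure assigns a value to a single entity, while a metric (strictly) is a two-argument function, like a distance on the agreed scale. The class shapes and names here are assumptions.

```python
# Measure vs. metric, per the Henderson-Sellers distinction quoted above.
def measure_noa(cls):
    """Measure: Number Of Attributes of one class (a single numerical value)."""
    return len(cls["attributes"])

def metric_noa(cls_a, cls_b):
    """Metric: distance between two classes on the NOA scale (two arguments)."""
    return abs(measure_noa(cls_a) - measure_noa(cls_b))

account = {"attributes": ["id", "owner", "balance"]}
audit_log = {"attributes": ["entries"]}
```

In everyday usage (and in the catalog later in this talk) “metric” is used loosely for both.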
Metrics
Knowing what to look for…
– Reading
• Software metrics books
• OO design books
• GoF’s Design Patterns
– Mentors
• Senior developers in your organization
• Outside consultants
– Experience
• Takes time to develop depth and breadth
• Firm grasp of OO techniques
– Tools
Metrics
Metrics are not a “silver bullet”
Metrics are improving, albeit slowly…
To benefit from metrics, you must understand:
– Object-oriented design concepts
– What metrics are measuring
– How to interpret the results
As with audits, automation is key
Benefits of Automation
Deep and precise parsing of the model
Saves time and budget
Separation of concerns:
– Computers excel at low-level inspection and counting
• People do not
– People excel at business logic and design inspection
• Computers do not
Refactoring
Martin Fowler defines refactoring as:
– “…the process of changing a software system in such a way that it does not alter the external behavior of the code yet improves its internal structure.”
Why Refactor?
– To improve design, understandability, and code quality
– To decrease the cost of maintenance
Software Remodeling
• “Bad smells” in source can be identified for refactoring using audits and metrics
• Similarly, static analysis of UML can be used to identify the need to “remodel”
Model Audits and Metrics
A Starter Catalog
Sample Audits• Avoid Aggregation, Favor Composition
• Avoid Dangling Model Elements
• Always Indicate Multiplicity
• Always Indicate Navigability
• Avoid Multiplicities Involving Max and Mins
• Avoid * Multiplicity
• Always Name Associations
• Avoid Using Dependencies
• Do not Overlap Guards
• Do not Use Disjoint Guards
• Identifier Conflicts with Keyword
• Indicate Role Name on Association Ends
• Indicate Role Names on Recursive Associations
• Lines Should Not Cross
• Naming Conventions
• Never Place Guard on Initial Transition
• Provide Comment for OCL Constraints
• Use Plural Names on Association Ends with Multiplicity > 1
• Avoid Generalization Between Use Cases
• Avoid Unassociated Actors
• Avoid <<uses>>, <<includes>>, and <<extends>>
• Avoid Weak Verbs at Beginning of Use Case
• Avoid Association Classes
• Abstract Class Declaration
• Avoid Cyclic Dependencies Between Packages
• Avoid N-ary Associations
• Avoid Qualifiers
• Always Specify Type on Attributes and Parameters
• Class Should be Interface
Sample Audits
• Conflict With System Class
• Do not Model Elements of Implemented Interfaces
• Do not Model Scaffolding Code
• Do not Name Associations that have Association Classes
• Hiding Inherited Attribute
• Hiding Inherited Static Method
• List Static Operations/Attributes Before Instance Operations/Attributes
• Overriding Non-abstract Method with Abstract Method
• Subclasses have the Same Member
• Use Singular Names for Classes
• Avoid Modeling Destruction
• Avoid Modeling Return Arrows
• Avoid “Black Hole” States
• Avoid “Miracle” States
• Avoid Recursive Transitions With no Entry or Exit Actions
• Avoid “Black Hole” Activities
• Avoid “Miracle” Activities
• All Transitions Exiting a Decision Must Have Guards
• Forks Should Have Only One Entry Transition
• Joins Should Have Only One Exit Transition
• Components Should Only Depend on Interfaces
General Diagram Audits
Avoid Dangling Model Elements (ADME)
• Model elements not appearing on a diagram are hard to track and may confuse generation facilities.
Model element not found on any diagram
General Diagram Audits
Do not Overlap Guards (DOG)
• Guard conditions should cover all possible outcomes, but without overlapping.
Overlapping guard conditions on outgoing transitions
General Diagram Audits
Do not Use Disjoint Guards (DUDG)
• Guard conditions should cover all possible outcomes, without leaving gaps in range.
Disjoint guard conditions on outgoing transitions
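The DOG and DUDG audits can be sketched together, under the simplifying assumption that guards are half-open numeric ranges [lo, hi) rather than arbitrary boolean expressions (which would require a constraint solver):

```python
# Toy check for overlapping and disjoint (gapped) guard ranges.
def guard_problems(guards, lo, hi):
    """Report overlaps and gaps among guard ranges meant to cover [lo, hi)."""
    problems = []
    guards = sorted(guards)
    for (a_lo, a_hi), (b_lo, b_hi) in zip(guards, guards[1:]):
        if b_lo < a_hi:
            problems.append(f"overlap at {b_lo}")   # Do not Overlap Guards
        elif b_lo > a_hi:
            problems.append(f"gap at {a_hi}")       # Do not Use Disjoint Guards
    if guards and (guards[0][0] > lo or guards[-1][1] < hi):
        problems.append("outcome range not fully covered")
    return problems
```

For example, guards [0, 5) and [5, 10) over [0, 10) are clean; [0, 5) with [4, 10) overlaps; [0, 4) with [6, 10) leaves a gap.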
General Diagram Audits
Provide Comment for OCL Constraints (PCOC)
• OCL can be difficult to read, so a comment should be provided to clarify.
OCL constraint should have comment
Use Case Diagram Audits
Avoid Generalization Between Use Cases (AGBUC)
• The use of generalization in use cases is rare and not well understood. Use <<extend>> or <<include>> relationships instead.
Generalization relationships between use cases should be avoided
Class Diagram Audits
Avoid Association Classes (AAC)
• Association Classes can be decomposed into a separate class that associates two others. These may confuse generators, or be decomposed anyway.
Association classes and n-ary associations should be avoided
Class Diagram Audits
Class Should be Interface (CSI)
• Abstract classes with only abstract methods and static final fields should be declared as an interface.
An abstract class with only abstract methods and final static fields should be an interface.
Class Diagram Audits
Do not Model Elements of Implemented Interfaces (DMEII)
• To reduce clutter in a class representation, avoid displaying redundant elements.
Elements of implemented interfaces should not be visible on the same diagram
Class Diagram Audits
Subclasses have the Same Member (SSM)
• If two or more subclasses of a class or interface share a member, a “pull up” refactoring may be in order.
Subclasses contain similar member ‘setNewValue:void’ – consider a Pull Up Method refactoring
Sequence Diagram Audits
Avoid Modeling Return Arrows (AMRA)
• To reduce clutter on diagrams, the explicit modeling of return arrows is discouraged.
Return arrows tend to clutter sequence diagrams
State Diagram Audits
Avoid “Black Hole” States (ABHS)
• Only end states should have an incoming transition with no outgoing transition.
Only end states should have no outgoing transition
Sample Metrics
• Number of Colors on Diagram
• Number of Elements on Diagram
• Number of Stereotypes on Diagram
• Number of Outgoing Transitions
• Number of Incoming Transitions
• Depth of <<include>> Relationships
• Depth of <<extend>> Relationships
• Number of Extension Points
• Depth of Inheritance Hierarchy
• Number of Child Classes
• Attribute Complexity
• Attribute Hiding Factor
• Attribute Inheritance Factor
• Class Interface Width
• Data Abstraction Coupling
• Number of Levels
• Cyclomatic Complexity
• Number of Swimlanes
• Number of System Boundaries
• Method Hiding Factor
• Number of Parameters
• Number of Accessor Methods
• Number of Attributes
• Number of Added Methods
• Number of Classes
• Number of Constructors
• Number of Members
• Number of Operations
• Number of Overridden Methods
• Number of Public Attributes
• Package Size
• Response For Class
• Weighted Methods Per Class
• Weight of Class
• Number of Branches
General Diagram Metrics
Number of Colors on Diagram (NOCD)
• More than 4 colors on a diagram will cause readability problems.
More than four colors on a diagram decreases readability
Use Case Diagram Metrics
Depth Of <<include>> Relationships (DOIR)
• A series of nested <<include>> relationships indicates functional decomposition and decreases readability.
More than three levels of <<include>> reduces readability
Class Diagram Metrics
Depth Of Inheritance Hierarchy (DOIH)
• A measure of the inheritance chain from the root to the measured class.
Excessive depth of inheritance leads to brittle design and maintainability issues.
Evaluate the design, considering potential Extract Class, Pull Up, Push Down, etc. refactorings.
Also, consider replacing inheritance with composition where appropriate.
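DOIH is simple to compute once the generalization relationships are extracted; here is a sketch using a parent map (class name to superclass name, or None at the root), where the hierarchy contents are illustrative assumptions:

```python
# Depth Of Inheritance Hierarchy from a class -> superclass map.
def doih(cls, parent):
    """Number of generalization steps from cls up to the hierarchy root."""
    depth = 0
    while parent.get(cls) is not None:
        cls = parent[cls]
        depth += 1
    return depth

parent = {
    "Account": None,
    "SavingsAccount": "Account",
    "JuniorSavingsAccount": "SavingsAccount",
}
```

The root measures 0; each generalization step adds one.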
Class Diagram Metrics
Number Of Child Classes (NOCC)
• The number of directly or indirectly derived classes from the measured class.
Excessive child classes hurt maintainability and may indicate poor design.
Evaluate the design, considering a potential Extract Class refactoring or a design with composition.
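Since NOCC counts indirect as well as direct subclasses, it is a transitive count over the generalization graph; a sketch, with an assumed toy hierarchy:

```python
# Number Of Child Classes from a class -> direct subclasses map.
def nocc(cls, children):
    """Number of classes derived, directly or indirectly, from cls."""
    direct = children.get(cls, [])
    return len(direct) + sum(nocc(child, children) for child in direct)

children = {
    "Shape": ["Circle", "Polygon"],
    "Polygon": ["Triangle", "Square"],
}
```

"Shape" measures 4 (two direct children plus Polygon's two), while a leaf like "Circle" measures 0.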
Class Diagram Metrics
Number Of Attributes (NOA)
• The number of attributes owned by a class.
An excessive number of attributes may indicate a class needs to be split.
Evaluate the class to determine if its responsibilities and their data elements can be moved using an Extract Class refactoring.
Activity Diagram Metrics
Number Of Outgoing Transitions (NOOT)
• The number of transitions leaving a decision or fork.
An excessive number of outgoing transitions from decisions or forks may be hard to understand and maintain.
Evaluate the sequence to identify potential decomposition of logic.
Activity Diagram Metrics
Cyclomatic Complexity (CC)
• The number of decision points or forks in an Activity graph.
An excessive number of decisions or forks in a diagram may be hard to understand and maintain.
Evaluate the sequence to identify a potential Extract Method-like refactoring (Extract Sequence?).
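Using the slide's definition (a count of decision and fork nodes; note that McCabe's classic formula adds 1 to this count), CC reduces to a simple traversal. The node representation below is an illustrative assumption:

```python
# Cyclomatic Complexity of an activity graph, per the definition above.
def cyclomatic_complexity(nodes):
    """Count decision points and forks among the activity graph's nodes."""
    return sum(1 for node in nodes if node["kind"] in ("decision", "fork"))

activity = [
    {"kind": "initial"},
    {"kind": "action"},
    {"kind": "decision"},
    {"kind": "action"},
    {"kind": "fork"},
    {"kind": "final"},
]
```

This toy activity graph, with one decision and one fork, measures 2.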
Examples
What’s wrong with this picture?
Beyond the UML
What’s wrong with this BPMN Diagram?
Beyond the UML
Is ‘E’ intended to be part of the loop?
Summary
• In order to take advantage of MDA, more rigorous modeling is required
• Manual inspection and detection is inefficient and costly
• Automated analysis is available and proven in source code inspection
• Model level audits and metrics coming soon to a UML modeling tool near you…
Questions?
Thank you!
Please fill out the speaker evaluation
You can contact me further at …[email protected]
References
[Coad99] P. Coad et al, Java Modeling in Color with UML: Enterprise Components and Process. Upper Saddle River, NJ: Prentice Hall PTR, 1999.
[Frankel03] D. Frankel, Model Driven Architecture: Applying MDA to Enterprise Computing. Indianapolis, IN: Wiley Publishing, 2003.
[Fowler99] M. Fowler, Refactoring: Improving the Design of Existing Code. Reading, MA: Addison-Wesley, 1999.
[Fowler04] M. Fowler, UML Distilled: A Brief Guide to the Standard Object Modeling Language, Third Edition. Boston, MA: Addison-Wesley, 2004.
[Ambler03] S. Ambler, The Elements of UML Style. New York, NY: Cambridge University Press, 2003.
[TEC 6.3] Together Edition for Eclipse, version 6.3. See product documentation for further information regarding the audits and metrics referenced.