
Visual Studio Agile Deployment Assessment
Delivery Guide
Developer Tools Deployment Planning Services
Agile Deployment Assessment Delivery Guide, Version 1, 10.2013
Table of Contents
1 Purpose of the Delivery Guide
1.1 Engagement Agenda
2 Engagement Overview
2.1 Phase 1: Pre-Engagement Activities (Day 0)
2.2 Phase 2: Kick-off and Presentation (Day 1)
2.3 Phase 3: Discovery Workshops and Interviews (Day 1-2)
2.4 Phase 4: Agile Deployment Assessment Deliverable Development (Day 3)
2.5 Phase 5: Plan and Resources Delivery (Day 3)
Appendix A - ALM Evaluation Guidance
Executive Overview
Vision Statement
Scope of Engagement
Out of Scope
Risk Assessment
Potential Customer Views
Overcoming Interview Obstacles
Evaluation Benefits
Understanding the Customer's Situation
Appendix B - ALM Maturity Level Definitions
Appendix C - Maturity Model Practice Areas
1 Purpose of the Delivery Guide
The Visual Studio Agile Deployment Assessment Delivery Guide provides the delivering consultant with written guidance on the engagement process, to assist in developing the agile recommendations and roadmap. Please note that this document contains live links to resource and reference materials.
In addition to outlining the entire engagement, the Delivery Guide provides detailed instructions for
four essential on-site activities:
(1) Understanding the current software development process
(2) Eliciting the desired objectives for an agile implementation
(3) Creating appropriate and tailored recommendations
(4) Delivering the final agile recommendations and optional Leave-Behind Materials
Note: This DTDPS is not amenable to a single, standardized approach. Development organizations have
different needs, different environments and different goals. One type of solution does not match all
development teams. This means that discretion must be used when asking questions, following up on issues, and making recommendations. For instance, if the organization is considering Scrum, or an agile practice
similar to Scrum, then following the approach laid out in the sample recommendations may make sense.
Alternatively, if the organization is looking toward a continuous flow model, they may consider a Kanban
approach. However, there are key principles shared by all lean and agile methodologies. Namely, they
seek to reduce the time it takes to get feedback (at all levels of development), they seek to engage the
team members as key contributors working in high performing teams, they seek to improve overall
quality, and they seek to reduce cycle time (the time for an idea to become delivered code). This
document will focus on the shared principles, allowing the consultant to tailor the approach
appropriately to the customer’s organization.
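To make the shared goal of reducing cycle time concrete, the sketch below shows one way a consultant might measure it from a customer's work-item history during discovery. This is a minimal illustration only; the work items, field names, and dates are invented assumptions and are not part of the engagement materials.

from datetime import datetime

# Illustrative work-item history (assumed fields: when the idea was captured and when code shipped).
work_items = [
    {"id": 101, "created": datetime(2013, 9, 2), "delivered": datetime(2013, 9, 20)},
    {"id": 102, "created": datetime(2013, 9, 5), "delivered": datetime(2013, 10, 1)},
    {"id": 103, "created": datetime(2013, 9, 10), "delivered": datetime(2013, 9, 24)},
]

# Cycle time: elapsed days from an idea being captured to its code being delivered.
cycle_times = [(item["delivered"] - item["created"]).days for item in work_items]

print("Average cycle time: {:.1f} days".format(sum(cycle_times) / len(cycle_times)))

Tracking a number like this before and after an adoption effort gives the team a simple, shared measure of whether the recommendations are actually shortening feedback loops.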
1.1 Engagement Agenda
The three-day engagement involves a targeted agile assessment that assists in developing an effective
strategy for adopting or improving agility. Ideally, implementing the Team Foundation Server agile planning tools will provide an excellent means of improving a customer's agility. The end result of
this engagement is a set of customized recommendations for improving agility. The table below includes
the suggested agenda for the engagement.
Day | Activity

Phase 1: Pre-Engagement Activities (Day 0)
Day 0
 Customer completes and returns the Project Scope and Customer Commitment doc.*
 Customer completes and returns the Pre-Engagement Questionnaire.*
 Customer responses are used to adjust the project scope (if necessary).

Phase 2: Kick-off and Presentation (Day 1)
Day 1
 Review Pre-Engagement Questionnaire with customer.
 Conduct the Introduction to Microsoft ALM for customers*, or similar presentation.
 Lead discussion to understand desired outcome of engagement.

Phase 3: Discovery Workshops and Interviews (Day 1-2)
Day 1-2
 Interview SMEs to understand needs of customer organization, consisting of:
o Questions tailored to the organization based on the answers from the Pre-Engagement Questionnaire and initial discovery.

Phase 4: Agile Deployment Assessment Deliverable Development (Day 3)
Day 3
 Compile inputs from assessment to complete the Agile Deployment Assessment Template*.

Phase 5: Plan and Resources Delivery (Day 3)
Day 3
 Review final agile recommendations with customer.
 Install and walk through any leave-behind materials you deem useful for the customer.
 Complete Partner Survey* and submit deliverables* to Microsoft for payment.

*Items that correspond to templates or documents available from the DTDPS website.
2 Engagement Overview
Each of the engagement phases is briefly described in the following sections. For more detailed
information about each step, refer to the “DTDPS Service Guide”.
All engagement materials can be found through the partner portal at www.microsoftdtdps.com:
1. On the home page, hover over Agile Planning Services; a second menu will appear to the right.
Select "Conduct the Engagement."
2. On the “Conduct the Engagement” page for DTDPS, select the appropriate engagement from
the green box.
3. A new page will open with all engagement materials listed in the right column under
“Resources.”
2.1 Phase 1: Pre-Engagement Activities (Day 0)
In Phase 1, “Pre-Engagement Activities”, the consultant and customer document the scope and
objectives for the overall engagement.
START here: http://www.microsoftdtdps.com/
>>Conduct the Engagement >> Visual Studio Agile Deployment Assessment >> Delivery Materials >>
Phase 1 Pre-Engagement Activities (zipped file includes the following)
Agile Deployment Assessment Project Scope and Customer Commitment (Word doc)
Agile Deployment Assessment Pre-Engagement Questionnaire (Word doc)
The specific tasks that are completed in this phase include:
 Review the customer’s business requirements
 Request that the customer complete the Pre-Engagement Questionnaire
 Review completed Pre-Engagement Questionnaire
 Review the ALM Evaluation and ALM Maturity Model (see Appendix A, B, and C) as
supplemental materials in preparation for interviews with customer
 Work with the customer to identify Subject Matter Experts (SMEs) - The consultant will work
with the customer to ensure that the roles and experiences of the SMEs, who have been
identified as interviewees, provide the proper breadth of understanding of the customer’s
processes and tools.
 Create an interview schedule
 Create a working-session schedule
2.2 Phase 2: Kick-off and Presentation (Day 1)
START here: http://www.microsoftdtdps.com/
>>Conduct the Engagement >> Visual Studio Agile Deployment Assessment >> Delivery Materials >>
Phase 2 Kick-off and Presentation
Introduction to Microsoft ALM
In Phase 2, "Kick-off and Presentation", the Lead Consultant will conduct a presentation to provide the customer's team with a broad knowledge of Team Foundation Server and use the workshop discussion to help the customer define their desired end state.
Tasks for this phase include holding a kick-off meeting with appropriate stakeholders and:
1. Presenting "Introduction to Microsoft ALM"
2. Demonstrating the TFS Agile Planning tools and Portfolio Management tools
3. Holding a Q&A session to gather requirements and scope
2.3 Phase 3: Discovery Workshops and Interviews (Day 1-2)
In Phase 3, “Discovery Workshops and Interviews,” the Lead Consultant interviews the SMEs, gathers
documentation, and develops initial findings. The "Agile Deployment Assessment Delivery Guide" (this document) is a reference that walks the consultant through the steps required to complete this phase.
The specific tasks that are completed in this phase include:
 Gather documentation on existing development processes and tools.
 Interview customer subject matter experts to complete the ALM Evaluation by utilizing the previously completed Agile Deployment Assessment Pre-Engagement Questionnaire, as well as other ALM practices.
 Develop initial findings.
Team Foundation Server is the collaboration platform at the core of the Microsoft Application Lifecycle Management (ALM) Solution. It has fantastic support for agile development teams. However, in order to get the most from a TFS deployment, a keen understanding of an organization's agile needs is required. An agile planning assessment helps reach that goal.
For this Planning Services engagement, the Lead Consultant is expected to have a good grasp of agile
development methodologies, and be able to explain agile principles effectively. As noted above, this
engagement guidance does not provide a simple checklist, or Excel workbook, that will automatically
generate the bulk of the report. Instead, the consultant is expected to come up with a tailored set of
recommendations that meet the target customer's needs. In addition, discretion, understanding, and a bit of sympathy are required for a successful engagement.
2.4 Phase 4: Agile Deployment Assessment Deliverable Development (Day 3)
In Phase 4, "Agile Deployment Assessment Deliverable Development," the Lead Consultant reviews the
initial findings from Phase 3 with the customer’s leadership team, determines recommendations to be
included in the plan that are outside the areas addressed by the engagement, and writes the plan.
START here: http://www.microsoftdtdps.com/
>>Conduct the Engagement >> Visual Studio Agile Deployment Assessment >> Delivery Materials >>
Agile Deployment Assessment Leave Behind Materials (zipped file includes the following)
Agile Deployment Assessment Deliverable Template (Word doc)
During the first part of the session, the consultant should work with the customer to drill down on the
specific core capabilities for their needs and target the primary issues with the highest impact for
working session topics. The primary discussion topic should be around agile project and portfolio
management, and agile techniques. Other possible discussion topics include:
 Code quality and promotion strategy
 Build management and merging scenarios
 Requirements management
 Test management
 Bottlenecks in the current software development process
For the second half of this phase, the consultant combines the learning from the engagement to write
the final agile recommendations.
The specific tasks that will be completed during this phase include:
 Identifying initiatives appropriately based on the maturity level of the practices in place.
 Generating a report based on the findings.
 Inserting the report into the "Agile Deployment Assessment Deliverable Template".
 Customizing the deployment plan by attending to the relevant sections, such as the "Future State", "Key Areas for Improvement", and "Roadmap" sections of the document.
A consultant may elect to use his/her own deployment plan template; however, the consultant must ensure all relevant topics (as outlined in the provided template) are covered in the final plan. Additionally, all materials and resources listed on the first page of the "Visual Studio Agile Deployment Assessment Recommendations Template" must be included in the final plan for the customer.
2.5 Phase 5: Plan and Resources Delivery (Day 3)
In Phase 5, “Plan and Resources Delivery”, the Lead Consultant wraps up the engagement.
START here: http://www.microsoftdtdps.com/
>>Conduct the Engagement >> Visual Studio Agile Deployment Assessment >> Delivery Materials >>
Agile Deployment Assessment Leave Behind Materials (zipped file includes the following)
Agile Deployment Assessment Leave Behind Resource Guide (Word doc)
Materials and Resources (virtual links, pdf documents, PowerPoint decks, etc.)
In the final phase of the engagement, the Lead Consultant will:
 Install and walk through any relevant leave-behind materials provided by Microsoft (see the leave-behind materials list below).
 Review the completed agile recommendations document with the customer.
 Submit deliverables to Microsoft for payment.
The final step in Phase 5 of the Agile Deployment Assessment is to upload leave-behind materials for the customer. These key resources are intended to help customers become more familiar with agile practices and support their decision to implement them. During the course of the on-site activities, the consultant may choose to upload these materials with the customer's subject matter expert, in preparation for a demonstration with the customer on the final day of the engagement.
A list of these resources is provided in the “Agile Deployment Assessment Delivery Guide” (below) as
well as in the “Agile Deployment Assessment Template.”
 VS2012 ALM Demo VHD - This virtual machine (VM) includes Visual Studio 2012 Ultimate, Visual Studio Team Foundation Server 2012, and a sample application along with sample data that supports hands-on labs. This VM includes everything you need to learn and/or deliver demonstrations of many application lifecycle management (ALM) capabilities in Visual Studio 2012. (Available through the link above.)
 Visual Studio Case Studies - A compilation of industry-leading white papers and case studies that describe the Visual Studio Application Lifecycle Management solution: http://www.microsoft.com/casestudies/
 Team Foundation Server MSDN Page: http://msdn.microsoft.com/en-us/vstudio/ff637362.aspx
 "How to" TFS videos on MSDN: How Do I? Videos
 Microsoft Visual Studio Team Foundation Server 2012 Trial: http://www.microsoft.com/visualstudio/eng/downloads
 Team Foundation Server Team Blog: http://blogs.msdn.com/b/team_foundation/
 VS and TFS ALM News: http://blogs.msdn.com/b/visualstudioalm/
 Brian Harry's Blog Site: http://blogs.msdn.com/b/bharry/
 Channel 9 Team Foundation Server Videos: http://channel9.msdn.com/tags/TFS/
 Visual Studio ALM Rangers: http://aka.ms/vsarunderstand and http://aka.ms/vsarsolutions
Appendix A - ALM Evaluation Guidance
Executive Overview
An Application Lifecycle Management (ALM) Evaluation provides customers with deep insights into the maturity of
their software development capabilities and recommends potential improvements to increase predictability and
success in their application development projects.
The ALM Evaluation offering has been developed to reduce customers’ implementation risks and problems by
identifying their current development processes and capabilities and delivering an actionable plan to implement
ALM best practices and tools.
The ALM Evaluation offering is aimed at the technical decision makers (TDMs) of enterprise customers. These
include Chief Information Officers (CIOs), Chief Technology Officers (CTOs), Lead Architects, Vice Presidents of IT
and Application Development, and Directors of Application Development in enterprises that have at least 50
developers.
Customer interest in the offering stems from the fact that application development is a complex task that, if not
performed systematically, has inherent inefficiencies due to the following:
 The complexities of managing the workflow across groups (especially with geographically dispersed teams)
 The inefficiencies in development across projects and applications (due to the limited reuse of components and services)
 Limited management visibility into the project status
 Managing software defects and the application lifecycle
Customers that participate in the TFS Deployment Planning offering can expect to receive a comprehensive report
that shows the state of their development capabilities today and provides both recommendations for how to
improve the maturity of their development capabilities and guidance on the expected productivity gains.
Vision Statement
The vision of the Application Lifecycle Management (ALM) Evaluation is to provide the customer with a prioritized
list of improvement initiatives designed to significantly improve how they develop software. This is accomplished
by assessing the customer’s current ALM maturity level and focusing on both their process and their tools. During
the engagement, the Consultant establishes credibility by identifying the customer's current ALM capabilities based on thorough ALM knowledge, thereby enabling the Consultant to position follow-on work that rapidly advances the customer toward the advanced stages of ALM maturity and expands their business efficiency.
Scope of Engagement
The scope of the engagement is determined early in the engagement by the Consultant. The Consultant must
determine the customer’s objectives for the evaluation and facilitate the choice of engagement type. The
customer can choose a standard or a more in-depth enhanced engagement, as outlined below. The standard
evaluation is a short-term engagement, good for initial trust building or as a discovery component of another
offering, like the Team Foundation Server Deployment Planning Service or a Production Pilot. The more
comprehensive enhanced evaluation is well suited to customers who want a deeper understanding of their ALM
capabilities and are willing to invest the time and money to gain that insight.
The standard ALM Evaluation is administered by a skilled Lead Consultant. It contains four phases and includes
evaluation of four or five practice area capabilities. These practice area capabilities are outlined in the pre-engagement planning and kick-off phase. Next, the Lead Consultant conducts up to 10 individual or group
interviews of knowledgeable employees (selected from testers, developers, and managers of the IT group) to
determine the current ALM state. Those findings are discussed, along with brief recommendations, in the Working
Session meeting. The Working Session meeting is customized to a few areas, based on the customers’ needs, as
well as evaluation findings. The final deliverable, the Customer Summary and Recommendations Report, provides
a prioritized roadmap for process and tool initiatives.
The ALM Evaluation phases, based on the engagement type, are outlined below.
Out of Scope
Several items are out of scope for an ALM Evaluation engagement, but possible as follow-up engagements.
Any activity that is not listed above is out of scope. Table 1 provides a sample of the most likely customer requests.
Each of these can be a follow-up engagement.
Out-of-scope Item: Installing and configuring Microsoft Visual Studio® Team Foundation Server
Follow-up Engagement:
 Microsoft Services ALM Team Foundation Server (TFS) Lab Pilot
 Microsoft Services ALM TFS Production Pilot

Out-of-scope Item: Process reengineering and roll-out organization
Follow-up Engagement:
 Microsoft Services ALM Project Management with Microsoft Solutions Framework (MSF) for Agile Software Development
 Microsoft Services ALM Project Management with MSF for Capability Maturity Model Integration (CMMI) Software Development
 Custom time and materials (T&M) engagement

Out-of-scope Item: Designing, building, and deploying any custom application
Follow-up Engagement:
 Custom T&M engagement
Table 1: Most common customer requests
The consultant should carefully evaluate the areas that are out of scope. These areas represent good opportunities
for follow-on work as part of a custom Partner engagement. When the engagement is complete and the closeout
meeting takes place, it is a good idea to turn to the out-of-scope section of the work order and show all the things
that can be done if the customer would like to continue with a custom engagement.
Risk Assessment
When delivering the ALM Evaluation offering, it is important for you to consider (and manage) the inherent risks.
To help you, some of the most common risks are listed in Table 2, along with their mitigation strategies.
Risk: Disruptive or passive-aggressive customer team members
Impact: Providing misinformation, hiding real issues, and reducing the impact of evaluation recommendations.
Mitigation Strategy: The consultant must understand that some of the customer's team members will consider the engagement a threat to their livelihoods. This guide provides specific tactics that can help to mitigate the customer anxiety, but nothing can replace excellent consulting soft skills. See "Overcoming Interview Obstacles" later in this appendix.

Risk: Management of scope
Impact: The engagement is "fixed fee," so accepting scope beyond that in the work order will negatively impact profitability.
Mitigation Strategy: Setting customer expectations during the Pre-Engagement phase is critical. Consultants need to control the expectations. The specific areas to keep in mind are:
 The depth of recommendations. With the short period of time in this engagement, it will not be possible to have a detailed roadmap with implementation plans for each initiative.
 The interviewing of additional teams or practice areas beyond the agreed-upon scope.
Table 2: Common risks
Potential Customer Views
It is important to understand how the customer views the attribute performance functions and the ALM maturity
evaluation questioning. That insight will aid your understanding of the impact that the ALM Evaluation engagement has on a company and of the reactionary attitudes of its employees. This understanding will
enable you to develop essential interview techniques for successful service engagements.
The ALM maturity evaluation is a tool that:
 Facilitates process improvement
 Analyzes the strengths and weaknesses (based upon industry best practices) of how things really work
 Acts as an instrument for organizational change
 Requires active and willing involvement and a group effort at self-analysis
Note: This is not an audit from an external party that gives a “report card.”
Typically, the senior management recognizes that there is an issue or crisis and asks an outside company to come
in and diagnose the obstacles and propose solution plans. An ALM Evaluation consultant is welcomed by these
managers, yet the employees you interview often have less welcoming and defensive attitudes. You may inform
them of the benefits just listed; however, often these employees feel that their jobs may be threatened by your
presence and as a result of your questioning methods. In addition to that stance, you are bound to encounter
some or all of the following common customer preconceptions during an ALM Evaluation engagement:
 If we "fail" the evaluation, I will look stupid in front of my boss.
 Will I lose my job if I tell the truth?
 I am being compared to Microsoft's theoretical model.
 I am just going to "game" the results (cover my weaknesses and show the best possible face).
 I do not have time for this worthless exercise.
 I will find excuses to skip the midpoint and findings meetings.
 Nothing is wrong with my group. I know we follow the best practices.
Overcoming Interview Obstacles
Going in with the knowledge that there is a problem to be found and someone to blame is not part of an ALM
Evaluation engagement. The goal of determining where the company is, in the ALM Maturity Model, plus where
they want to be takes a careful approach that focuses on the processes—not on the people—in an effort to
improve how the organization operates.
Approaching these evaluations with the mindset of a counselor who is there to listen and aid, helps you to obtain
both the results that are necessary for successful service relationships and the ability to offer future service and
product solutions. Go in as a jackhammer, digging up the road for the answers, and all you may get are tight-lipped
responses or an outright rebellion.
Remember to avoid firing off questions that can put the interviewee on the defensive. In preparation, make every
effort to internalize the capabilities, so that you will be able to ask questions in a conversational tone. Table 3 contains two examples of ways to conduct interviews, demonstrating the focus on the processes over the individuals and the decisions that they have made.
Capability: Code reuse
Description: Code reuse covers the ability to recognize what existing code can be used in new software projects. Code reuse aids the software development team by reducing software development time scales, reducing costs, and contributing to the dissemination of knowledge and expertise.
Customer Operations: "Applications are developed in silos, and code is replicated through each, so as you can imagine we do not really have anything being reused, or we use design patterns and no reference architecture."
Customer Situation: The developers get defensive because they claim that they know how it should have been done in the first place, but management pushed the silo approach to get the applications rolled out more quickly for the business users.
Interview Focus:
 The ALM consultant (interviewer) must focus on determining whether there is a culture around code and software reuse, as well as around skills and competencies. A line of questioning about the source code location(s), which tools are used, and where in the organization they are used should give the interviewer a clear indicator of whether the code is centralized and whether there are any processes in place for attempting code reuse.
 The interviewer should be able to articulate examples of reusable code. The interviewer should question the interviewee and possibly ask: Can you give me some examples of reusable code? Some answers to look for are:
o Platform libraries
o Application frameworks
o Reusable components, such as the logging application block
o Code from software factory code generation
 The interviewer should try to determine whether the customer understands the value of both code reuse and the proper architecture to enable code reuse, such as patterns that promote reuse.

Capability: Source control
Description: Source control is the use of the capabilities of the source control repository to enable developer productivity while ensuring the safekeeping of the code.
Customer Operations: "Our source control system is Microsoft Visual SourceSafe®. We use this for everything, including document storage. It is pretty well organized and things are versioned. We do, however, have problems with our code base structure in Visual SourceSafe. The lack of branching seems to be another problem for us, and so is concurrent development."
Customer Situation: Our developers do not check code in very often; they keep it on their individual computers. But our biggest problem is the lack of any traceability.
Interview Focus:
 Interviewers should be looking for a level of process and even process automation around source control. They should also examine whether heavy automation is compensating for current repository weaknesses.
 Labeling and branching comprise an area that the interviewer should thoroughly understand when performing these evaluations. This area should be given thorough attention during the interviews. The interviewer will be trying to understand the customer's level of understanding and usage of labeling and branching.
 When discussing branching, a savvy developer will most often pose a situation such as: When working with a new Version 1.1 release with bugs fixed, how do you make sure that those updates will be reflected in the Version 2.0 code that you are currently working on?
 The interviewer should be asking questions about the existence of a branching strategy, how it was defined, and what the criteria were.
 The interviewer will be asking about traceability. This person should try to delve into what the customer thinks about their level of traceability, where traceability is lost, and the cause for that. The interviewer should be able to articulate what traceability means, as well.
 Interviewers should also focus on determining whether the source control structure is hampering possible build optimization, and if so, why it is currently that way. Does it affect dependency management? An effective interviewer should drill down into dependency management and the problems around that topic, such as circular references.
Table 3: Sample ALM Evaluation questioning
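Because dependency management and circular references often come up in these source control discussions, the following minimal sketch shows the kind of cycle an interviewer might be probing for. The module names and dependency map are invented for illustration and do not describe any particular customer.

# Hypothetical module dependency map; the names are illustrative assumptions.
dependencies = {
    "billing": ["orders"],
    "orders": ["billing"],    # orders also depends on billing: a circular reference
    "reporting": ["orders"],
}

def find_cycle(graph):
    """Return one dependency cycle as a list of module names, or None if the graph is acyclic."""
    def visit(node, path):
        if node in path:
            return path[path.index(node):] + [node]
        for dep in graph.get(node, []):
            cycle = visit(dep, path + [node])
            if cycle:
                return cycle
        return None
    for start in graph:
        cycle = visit(start, [])
        if cycle:
            return cycle
    return None

print(find_cycle(dependencies))  # e.g. ['billing', 'orders', 'billing']

A customer whose build breaks whenever such a cycle appears, or who cannot say where one would be detected, is usually signaling a gap in both source control structure and build optimization.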
Evaluation Benefits
The customer views that were expressed earlier in this document are impacted by how the interview is presented
and conducted. What is important is not only the way that the interview questions are relayed, but
communicating the benefits that will be achieved during and through completion of the ALM Evaluation
engagement.
You might ask: Why is it important to understand the academic view of the evaluation? The answer is that you can
educate your customer about the many benefits by understanding the functions of an evaluation. This will aid in
defusing any negative attitudes held by any of the customer’s team members.
The Maturity Evaluation is an analytical tool that assesses the company’s ALM strengths and weaknesses. Although
you may know from the work order that a problem exists, there may be other issues to discover. These issues may
or may not be related to the primary reason for the engagement. Only by completing the entire ALM Evaluation engagement will you have the tools that are necessary to conduct the proper analysis.
The process of the evaluation provides a voice for change agents. Typically, staff members have a very good idea
of what should be changed, but they typically struggle to get their ideas heard. The evaluation gives them a voice.
The ALM Evaluation engagement provides staff members with an opportunity to better understand their own
organization, its practices, and its communications. It not only determines whether an organization is performing a
function, but whether they can reliably take localized success and generalize it across the organization to
continued and increased ALM success.
The engagement provides the organization a reference model that is based on Consultant experiences and
industry best practices. It aids in planning the company’s growth by prioritizing changes—indicating what to work
on first.
The evaluation process motivates self-analysis and growth in the following ways:
 All ALM-related teams are required to be interviewed. Through this process of broad participation and increased communication, the organization's internal change is fostered by the education of industry-wide ALM best practices that are discussed during the interview.
 When openness is fostered during cross-functional interviews and the organization is broadly participating, the process typically reduces "turf" protection and, at a minimum, makes it easier to spot. Your focus is to treat and focus on the processes, not on the people, in an effort to improve how the organization works.
 The ALM Evaluation engagement functions as a tool to implement process improvement plans. Members of different teams may have similar ideas on how to improve things, but because they may be compartmentalized, they may be easily dismissed. Interview participation from associated teams can broaden the communication of these ideas and allow them to come together to form a force for change. This can happen as a result of the ALM Evaluation engagement, the interview meetings, or both.
 The ALM Evaluation engagement acts as an instrument for organizational change. The ALM Maturity Evaluation aids organizations in mapping out the process for improvement, transforming how they work. Unification efforts require leadership to get everyone to see things in the same way and to get middle-level managers to see across borders.
 The ALM Evaluation engagement helps company unification efforts, because the evaluation requires leadership from the company's senior management to set goals, monitor progress, provide resources, and support change. In immature organizations, this can be a radical departure from the norm, where management looks at software development as a mystery.
 Evaluation results are fulcrums of positive change, transforming how the organization grows and achieves higher ALM practices.
Remember that your function is to be the counselor, not the jackhammer, and to benefit the customer by:
 Motivating self-analysis
 Conducting team interviews that require broad participation
 Fostering internal change
 Reducing turf protection or making it easier to spot
 Fixing the process, not the people
 Improving how the organization works
 Maintaining confidentiality
 Acting as a voice for change agents
 Getting people to see the same thing in the same way
 Involving senior management to help with efforts at unification
 Hosting an unthreatening environment
 Providing a safe environment to discuss the pros and cons of what they do
 Thinking across boundaries
 Offering additional beneficial Consultant services
With this knowledge, you will successfully reinforce the positive message to the customer’s staff through a careful
understanding of what fears you might face and how to defuse them.
Understanding the Customer’s Situation
Before starting the engagement, the consultant should get enough background information about the current
environment so that the engagement can be positioned within context. This should not be an in-depth analysis
(which is beyond the scope of this exercise), but it should capture how the customer currently feels. Some
examples of areas that should be covered are:
 The current development tools across the enterprise, with an emphasis on the primary tools that are used and the projects that will be focused on during the engagement.
 The customer's current "difficulties" with the software development process.
 Any process improvement efforts that might be in process (and the methodology that the customer uses for process improvement).
 The general software development methodology. Does the customer run an MSF Agile shop or a more formal (MSF CMMI) shop?
 The current skill levels.
 The current status of any initiative in the organization that relates to the application development process.
Understand whether the customer has a current technology roadmap. This may be either implied or explicitly
written. The customer’s roadmap and how it relates to the Microsoft roadmap is something that the consultant
delivering this engagement should consider, especially when preparing recommendations as a result of the
engagement.
Use the key findings from this exercise as inputs to the customer deliverable document that will be delivered at
the end of the engagement.
Appendix B - ALM Maturity Level Definitions
The ALM Maturity Model defines a set of ALM capabilities and four levels of maturity for each capability. Each
level of maturity brings with it a distinct level of value and specific benefits to the customer's organization. (See Table 4.) Similar to the Core Infrastructure Optimization (IO) and Application Platform Optimization (APO)
software application models, the ALM Maturity Model consists of four levels that range from the most basic to the
most advanced, dynamic maturity practices. The customer receives a score for each capability within each
maturity classification, based on how much of the maturity level they practice. The four maturity levels are
classified as follows:
 A company at the Basic maturity level is typically homegrown and does not have documented processes; has little to no cross-functional communications; and performs in an ad-hoc, informal manner. (These companies are not professional development organizations, and they usually do not know the next steps for developing software.)
 A company at the Standard maturity level performs in a more uniform way but not 100 percent consistently, such as when a few departments follow the process but you see that some of the other areas do not. The company may follow best practices, but it is not receiving the value because of implementation or commitment.
 A company at the Advanced maturity level performs the right process across the organization, has clearly documented processes that are maintained, and follows best practices. This level is where most companies strive to be.
 The Dynamic maturity level is rarely found, because it is not feasible for most companies to be performing at this level. Therefore, do not be alarmed or try to lift a customer into this maturity level, since it may not make economic or business sense for them to be there. The companies that qualify for this level generally perform at the top of their industry and include the lower maturity levels in their practices.
This model aids your customers in understanding the key competencies that are related to successful ALM
adoption. The model also offers them a clear and targeted plan that is tailored to their companies’ objectives.
Table 4 provides the maturity characteristics that are associated with each level.
Maturity Level: Basic
Characteristics:
 The development team uses homegrown practices.
 The practices are performed in an ad-hoc or informal manner.
 The practices are undocumented.
 There is little (or no) communication across teams.
 Some key roles (such as QA) are not consistently performed.

Maturity Level: Standard
Characteristics:
 Best practices are starting to be adopted.
 Multiple development teams (or all of them) are starting to use consistent practices.
 The documentation of practices is slight or informal.
 The tools that are used are generally disconnected and not integrated.
 There is a relatively informal use of the tools, with no usage policies defined.

Maturity Level: Advanced
Characteristics:
 Best practices are adopted and documented.
 The documentation is formally maintained.
 The use of tools that support best practices is pervasive across teams.
 The tools are fully integrated into the integrated development environment (IDE).
 Tool usage is formal, with usage policies documented and enforced.

Maturity Level: Dynamic
Characteristics:
 Development practices are highly innovative and demonstrate industry leadership.
 Portfolio management tools and processes are fully integrated.
 It is possible to track requirements and use impact analysis reports for change requests.
 Help desk quality metrics are used to track errors, turnaround time, and the cost of maintenance.
Table 4: ALM Maturity Model levels
Appendix C - Maturity Model Practice Areas
The ALM Maturity Model Evaluation is based on eight practice areas of the ALM process. Inside each practice area
are the core capabilities and their attributes. You will be inquiring about and applying a score to each practice
area. You arrive at the maturity level by calculating the scores of these eight ALM practice area attributes. Your
interview questions determine the scoring for each attribute in the practice areas, which are then combined to
determine the maturity level of the customer.
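To make the scoring mechanics concrete, the sketch below rolls attribute scores up into practice area scores and an overall maturity level. The eight practice areas come from this appendix; the attribute scores, the simple averaging, and the rounding to a level are illustrative assumptions, since the evaluation does not prescribe one specific formula.

# Illustrative roll-up of interview scores; the scores and the averaging/rounding rules are assumptions.
attribute_scores = {
    # practice area -> attribute scores gathered during interviews (1 = Basic ... 4 = Dynamic)
    "Architecture and Design": [2, 3, 2],
    "Development": [3, 3, 2, 3],
    "Software Configuration Management": [2, 2],
    "Deployment and Operations": [1, 2],
    "Governance": [2, 1],
    "Project Planning and Management": [3, 2],
    "Testing and Quality Assurance": [2, 2, 3],
    "Requirements Engineering and User Experience": [1, 2],
}

LEVEL_NAMES = {1: "Basic", 2: "Standard", 3: "Advanced", 4: "Dynamic"}

# Average the attribute scores within each practice area, then average the areas.
area_scores = {area: sum(s) / len(s) for area, s in attribute_scores.items()}
overall = sum(area_scores.values()) / len(area_scores)

for area, score in sorted(area_scores.items()):
    print("{}: {:.2f}".format(area, score))
print("Overall maturity: {:.2f} ({})".format(overall, LEVEL_NAMES[round(overall)]))

However the scores are combined, the value for the customer lies less in the single overall number than in the per-area breakdown, which points to the practice areas the roadmap should address first.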
Table 5 lists the practice areas along with some related Basic through Dynamic maturity traits in more detail.
Practice Area: Architecture and Design

Basic
 The architecture is not properly documented.
 Architecture creation does not consider deployment early in the design process.
 The understanding of the Architect role is unclear.
 The use of modeling tools is inconsistent or non-existent.
 No clear process exists for transforming business requirements into technical requirements.
 Use of the database modeling tool is inconsistent. There is limited database architecture.

Standard
 The Architect role is understood and clearly identified, combined with other roles.
 Tools have been identified, and an early adoption phase exists.
 Some habits have started forming, and some process consistency exists.
 Documentation is maintained irregularly and inconsistently across teams or projects.
 Individual project teams have documented database architectures, but no enterprise databases exist.

Advanced
 A dedicated architecture team exists.
 Architectural tools take the deployment process into account.
 Integrated tools are used across different teams and projects.
 The use of practices and processes is capitalized on.
 Practices and processes are applied across teams and projects.
 Some or all of the database modeling is delegated to a dedicated Database Administrator (DBA) team.

Dynamic
 The process is formalized, documented, and has an architecture.
 The inclusion of patterns and practices is consistent.
 A clearly defined mechanism exists for sharing or forcing the usage of patterns and practices across projects and teams.
 The organization contributes back to the development community both internally and externally through the use of published articles, white papers, and conferences.
 Well-documented enterprise databases, consistent and versioned enterprise data models, and feedback loops to continuously improve the process exist.

Practice Area: Development

Basic
 Developers perform only functional testing, if that.
 Unit testing is left up to the discretion of the individual coder.
 Little if any code reuse exists.
 No code reviews by peers or mentors take place.
 No metrics are available for code quality.
 Code analysis is ad hoc, if it is used at all.
 No published coding standards exist.
 Database schema-naming conventions are ad hoc and inconsistent.
 The database programmability and schema are not in source control.

Standard
 Some unit testing takes place, but no formal, written, standard approach exists.
 Some developer-written unit tests appear in source control.
 Handcrafted "one-off" test classes exist, but they are not integrated into the integrated development environment (IDE) testing framework.
 Key code is peer-reviewed.
 The testing policy is inconsistent and informal, but a willingness to improve code quality through unit testing exists.
 Basic naming standards exist, but they are not consistent across all teams.
 The use of comments in the code is inconsistent.

Advanced
 Consistent use of, and policies for, unit testing that supports smoke tests exist.
 Code coverage is used and consistently applied.
 Static analysis is used.
 Statistics are regularly captured.
 Developer-written unit tests run after the QA build verification tests (BVTs).
 Code is consistently and thoroughly reviewed.
 Published coding standards are consistently applied.
 The database is fully versioned in source control.

Dynamic
 Benchmarks for quality targets exist.
 "Performance budgets" for developers exist.
 Targets can be internally set.
 The organization is an industry leader with published quality metrics.
 The continuous monitoring of statistics and metrics for continuous improvement exists.
 The auditing of coding standards takes place.

Practice Area: Software Configuration Management

Basic
 Source control may or may not be used.
 Local copies of code exist.
 An unclear understanding of branch and merge concepts exists.
 The build process is manual and on-demand.
 The build process is not documented.
 No traceability exists among the build, the content or work performed, and the requirement.
 Check-in occurs irregularly.

Standard
 The source-control tool usage is not integrated.
 A dedicated build computer exists.
 The build process is informal and undocumented.
 Branching and merging are understood by the lead integrators.
 Daily or regular check-in is performed.

Advanced
 An integrated source-control tool in an IDE is used.
 A dedicated configuration management role exists.
 The build process is formal and documented.
 Build metrics are regularly published.
 On-demand builds are enabled.
 Unit tests run after BVTs.
 The database is built from source-controlled components as part of a build process.

Dynamic
 Highly sophisticated build scripts exist.
 Multiple internally and externally produced code modules are integrated.
 Build-outcome alerting and monitoring takes place.
 All schema objects are unit tested prior to a schema change being completed.
 A scripted tool is used to simultaneously deploy schema changes to multiple environments.

Practice Area: Deployment and Operations

Basic
 Little or no communications exist between the operations and development teams.
 No formal help desk or bug tracking process exists.
 No tools are used; operations are based on e-mail messages with manual follow-up.
 Infrastructure deployment issues are identified and resolved at deployment time.
 The environments, such as those for development, pre-production, testing, UAT, and production, are not segregated.
 The build promotion schedule is ad hoc.
 Build promotion is unregulated.
 The environment is undocumented.

Standard
 The help desk incident tracking tool (for training, user issues, and infrastructure) is a stand-alone one.
 Incidents are not integrated with TFS bug tracking nor escalated to TFS.
 The automation and validation of build deployments is limited.
 Some monitoring is in place.
 Some procedures and/or approval processes are in place for build deployment.
 The Deployment Manager role has been identified.
 The infrastructure is documented.
 The environment is segregated, but ownership is unclear.

Advanced
 The help desk is integrated with the bugs in TFS.
 Monitored instrumentation is hooked into the infrastructure and the applications.
 Tools exist to deploy and validate successful build deployments, smoke tests, testing scripts, and data generation scripts.
 An approval process with traceability integrated into the tools exists.
 A clear cross-functional team (of infrastructure-development team members) has been identified.
 The infrastructure architecture is documented in integrated tools.
 The environments are segregated, ownership is clearly defined, and promotion procedures are well understood and consistent among the environments.

Dynamic
 Help desk quality metrics are used for turnaround time, the cost of maintenance, and the identification of error-prone subsystems.
 Deployment is automated.
 Monitoring is proactive and ongoing.
 IntelliTrace data collectors exist on production environments to enable IntelliTrace snapshotting for developers.

Practice Area: Governance

Basic
 Projects are started with limited justification.
 Projects are funded based on key influencer opinions.
 No return on investment (ROI) evaluation or retrospective takes place.
 No portfolio review process takes place.
 No compliance program or target is in place.
 No process improvement initiative is in place.

Standard
 The certification for the chosen compliance program is informal.
 Compliance certification is applied and inconsistently monitored across teams.
 Processes use semi-manual tools (for example, Microsoft Excel® lists).
 The use of initiative targets is random.

Advanced
 The certification for the chosen compliance program is formal.
 Portfolio management techniques are used, but portfolio and project management tools are not necessarily integrated.
 Cross-team resources are managed and their time is assigned.
 Governance is integrated with the certification and compliance program.

Dynamic
 The portfolio management tools and process are fully integrated.
 The ROI and retrospective are supported by metrics.
 The organization has Microsoft Project Server integrated with TFS.
 Participation occurs in the creation and review processes of industry standard compliance programs.

Practice Area: Project Planning and Management

Basic
 The processes for estimation, planning, risk management, and scope are informal or non-existent. An instinctive and intuitive approach is used.
 No formal stakeholder communications plan is in place.
 Team coordination is informal, and task assignments are made verbally or by using e-mail.
 Financial information is not evaluated by the Project Manager on an ongoing basis.
 No clear Project Manager responsibility is defined.

Standard
 Microsoft Project may be used.
 The use of Project is individual, not integrated, and not standardized.
 Tool usage is dependent on the strength of the individual Project Manager.
 Financial information is manually evaluated by the Project Manager.
 The Project Manager responsibility is clearly assigned.

Advanced
 The integrated use of TFS exists for the management of bugs, tasks, and change requests.
 EPM is used for financial and resource tracking through EPM (Project Server) to TFS integration.
 External resources, stakeholders, and partners share project information and have integrated tools, such as Microsoft SharePoint® and Microsoft Visual Studio® Team Web Access, to perform their roles in the project.
 Dedicated Project Managers exist.
 Constant feedback from the product owners and end users to the development teams is easy and quick.

Dynamic
 The full integration of Portfolio Manager is used to establish the project's health across the organization.
 Initiative impacts and requirements are viewable across the organization.
 Metrics are used to drive projects and aid in estimation and re-estimation.
 A Program Management Organization (PMO) is in place.

Practice Area: Testing and Quality Assurance

Basic
 No dedicated quality assurance (QA) team exists.
 Ad-hoc functional testing, which is closer to debugging than testing, is performed by the development team.
 No quality metrics exist.
 The fix and deploy cycles are long.
 The regression bug rate is high.
 Test data is not abstracted, and no test data generation is available.

Standard
 A dedicated QA group has been staffed.
 A test planning process has been defined.
 Non-integrated testing tools are in place.
 The testing procedures and environment are informally documented.
 Progress tracking is rudimentary.

Advanced
 The organizational culture is accepting of defined testing policies.
 Test planning begins at the requirements phase.
 Testing is a measured and quantifiable process.
 Integrated tools generate publishable metrics.

Dynamic
 A test process improvement group and tools are in place.
 The organization has industry leadership on evaluating potential testing tools and strategies.
 Testing is based on statistical sampling and measurements of confidence, trustworthiness, and reliability.
 Defect prevention is practiced.
 The use of automated testing tools is audited and required prior to the deployment of schema changes.

Practice Area: Requirements Engineering and User Experience

Basic
 Broad assumptions are made by the development team that they know what to build.
 They work in an ad-hoc manner, using their own initiative, perceptions, and ingenuity.
 Little or no written requirements exist.
 Little or no validation with stakeholders takes place.
 Little or no customer feedback on user experience exists.
 No User Experience (UX) role has been defined.
 User documentation is either non-existent or not standardized (which means user documentation is on demand and as needed).

Standard
 The quality and format of documented requirements are consistent.
 The versioning of requirements is enabled and tracked.
 The requirements are accessible to all the stakeholders and team members.
 Some non-integrated tools are used for user interface (UI) modeling and prototyping.
 User-centered design principles are understood, but supported by disconnected tools.
 Some consistent user documentation exists.

Advanced
 Multiple types of requirements, for example functional, non-functional, business, system, and feature, are captured.
 Requirements relationships are tracked, and tasks are traceable.
 Fully integrated tools that use requirements coverage analysis reports are used for traceability.

Dynamic
 A product Change Control Board has been instituted.
 Impact analysis reports for change requests are used.
 Published metrics exist for new requirements, implemented requirements, tested requirements, and requirement change requests.
 UX experts incorporate the latest and best UI principles.
 The continuous improvement of UX for subsequent use occurs.
 UX designers have a good understanding of the technology and its limitations.
 The designers are able to understand the intersection of ease of implementation versus a great-looking UI.
Table 5: ALM Maturity Model practice area criteria