Implementing a program and then waiting a year to find out whether it made a difference is not a practical response to scrutiny from the executive suite. A better response is to include forecasting when planning investments intended to develop human capital. Forecasting impact and ROI illustrates what resources, processes, and supports are necessary to achieve impact. This session will show you how to move evaluation methods earlier in the program’s life cycle. We will demonstrate how to apply ROI evaluation methods to forecasting and complement them with additional techniques. In particular, we will cover the use of logic modeling to map the causal chain of events from investment to impact and ROI. A case example will illustrate each step of assessing the economics: defining current and anticipated costs, estimating future benefits, calculating the future return on investment (ROI), and conducting a sensitivity analysis.
Forecasting - Estimating the future value of training investments: Creating conversations to enable computation
1. Daniel McLinden, EdD
Assistant Vice President, Education & Training
Cincinnati Children’s Hospital Medical Center
Assistant Professor
Department of Pediatrics, College of Medicine
University of Cincinnati
ASTD – ICE
Monday May 7, 2013
12:30 – 1:45
Session M124
Room 102/104
2. Objectives
• Describe the benefits of forecasting.
• Conceptualize program impact.
• Estimate the future value of outcomes, combine benefits with costs to calculate the ROI, and conduct sensitivity analyses on the results.
• Estimate the future value of a more competent person.
• Understand the role of the evaluator as the facilitator of conversations among stakeholders about program features, benefits, and alternative investments.
4. The Evaluation Model(s)
[Table: stages of evaluation — Participation, Satisfaction, Learning, Application, Outcomes, Economic Impact — as mapped by the Kirkpatrick, Phillips, and Belfield models.]
Belfield, C., Thomas, H., Bullock, A., Eynon, R., & Wall, D. (2001). Measuring effectiveness for best evidence medical education: a discussion. Medical Teacher, 23(2), 164-70.
Kirkpatrick, D.L. (1994). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler.
Phillips, J.J. (2003). Return on investment in training and performance improvement programs. Boston: Butterworth-Heinemann.
5. Evaluation lets us answer …
• Are stakeholders satisfied with the content and
delivery of the program?
• Did the program’s participants acquire knowledge
and skills during the program?
• Are program participants implementing knowledge
and skills acquired from the program in a work
setting?
• Is the application of knowledge and skills having an
impact on the organization’s business measures?
• Does the monetary value of the benefits exceed the
cost of the investment?
6. Why evaluate impact on outcomes?
“ASTD estimates that U.S. organizations spent $125.88 billion on employee
learning and development in 2009 (p. 5).”
“The pursuit of effective learning evaluations continues to be one of
the most challenging aspects of the learning function … Although Kirkpatrick’s
Level 1 is the most commonly used type of evaluation, only 36 percent of
respondents who use it said it had high or very high value. In comparison,
evaluation of behavior (Level 3) and evaluation of results (Level 4) were rated
the most valuable by 75 percent of respondents. (p 21).”
ASTD State of the Industry Report
“8% and 4% of CEOs indicated that Impact and ROI, respectively, were being reported. 96% and 74% of CEOs indicated that these measures should be reported. These two items were ranked #1 and #2 among eight measures of training.”
Phillips, J. & Phillips P. (2011). Measuring For Success: What CEOs Really Think About Learning Investments.
7. Why evaluate impact on outcomes?
Why forecast impact on outcomes?
“Numbers numb, jargon jars and no
one ever marched on Washington
because of a pie chart. Tell stories.” --
Andy Goodman, Good Ideas for a Good Cause
8. McLinden, DJ (2010). Estimating the future value of training investments. In P.
Phillips (Ed.). ASTD Handbook for Measuring and Evaluating Training.
Alexandria, VA: American Society for Training and Development
Phillips, J. J. (2003). Return on investment in training and performance improvement programs. Boston: Butterworth-Heinemann.
Phillips, J.J. & Phillips, P.P. (2010). The consultant’s guide to results-driven business
proposals: How to write proposals that forecast impact and ROI. New York:
McGraw-Hill.
Swanson, R. & Gradous, D. (1988). Forecasting Financial Benefits of Human Resource
Development. Jossey-Bass.
How?
Methods to create a conversation about future impact and value
11. Harvey, J.B. (1974). The Abilene paradox: The management of agreement. Organizational
Dynamics, 3(1), 63-80.
It is agreement that
leads to trouble
12. A (flawed) shared mental model
[Diagram: a theory of change linking cause to effect, with costs, outcomes, and analysis as its elements.]
13. [Diagram: a logic-model chain from Inputs to Outputs to Short-term, Intermediate, and Long-term outcomes, mapped against Results, Assumptions, and Risks on a timeline ("When") running from the present into the future.]
If agreement leads to trouble, then provoke disagreement:
• Ask provocative questions
• Make provocative assertions
14. Tools to organize and make apparent a shared mental model
about investments for impact and value
To see an extensive list of free and commercial software to help create and portray a theory of
change, search for “List of concept mapping and mind mapping software” on Wikipedia or use this link
(http://en.wikipedia.org/wiki/List_of_concept_mapping_and_mind_mapping_software)
15. What’s your logic?*
A worksheet: for each stage of the program life cycle (Inputs, Outputs, Short-term outcomes, Intermediate outcomes, Long-term outcomes), answer four questions:
• Timing: When will this occur?
• Results: What changes will be observed?
• Assumptions: What conditions must exist?
• Risks: What barriers will prevent success?
* See note in resource list if interested in a copy of this tool.
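The worksheet above can be sketched as a simple data structure. The stage names come from the slide; every entry (timings, results, assumptions, risks) is a hypothetical example for illustration only:

```python
# Logic-model worksheet as a list of rows, one per life-cycle stage.
# Stage names are from the slide; all other entries are hypothetical.
logic_model = [
    {"stage": "Inputs", "timing": "Now", "results": "Budget and staff secured",
     "assumptions": "Sponsor funds the program", "risks": "Budget cut mid-year"},
    {"stage": "Outputs", "timing": "Months 1-3", "results": "Workshops delivered",
     "assumptions": "Target audience attends", "risks": "Low enrollment"},
    {"stage": "Short-term outcomes", "timing": "Months 3-6",
     "results": "New skills applied on the job",
     "assumptions": "Managers reinforce new skills", "risks": "No time to practice"},
    {"stage": "Intermediate outcomes", "timing": "Months 6-12",
     "results": "Error rates decline",
     "assumptions": "Skills drive the business measure", "risks": "Other factors dominate"},
    {"stage": "Long-term outcomes", "timing": "Year 1+",
     "results": "Cost avoidance realized",
     "assumptions": "Gains persist", "risks": "Turnover erodes gains"},
]

# Print the worksheet as stage | timing | expected result.
for row in logic_model:
    print(f"{row['stage']:22} | {row['timing']:10} | {row['results']}")
```

Writing the model down in this form makes each assumption and risk an explicit field that stakeholders can argue about, which is the point of the exercise.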
16. Activity: Creating a theory of change and impact.
This page is intentionally blank; other pages that pertain to the activity are omitted.
18. Calculating the cost is simple; getting people to agree on and share a mental model of the investment required is a challenge.
A separate worksheet for each phase:
(1) Planning & Design
(2) Development
(3) Delivery and performance support
20. Monetary Outcomes
Worksheet columns: Business Outcome | Description | Value of the business outcome per unit | Number of units | Impact Value | Attribution Percentage | Contribution Value
Worksheet rows: Cost Avoidance ($, #, %), Revenue Enhancement ($, #, %), and a Total Attribution Value.
Nonmonetary Outcomes
Worksheet columns: Business Outcome | Description
Estimate the monetary outcomes, if any. Estimate the nonmonetary outcomes.
Impact value = value of the outcome per unit × number of units
Contribution value = impact value × attribution percentage
Calculating the monetary values is simple; getting people to agree on and share a mental model of the anticipated impact is a challenge.
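The worksheet arithmetic can be sketched in a few lines. All dollar figures, unit counts, and attribution percentages below are hypothetical:

```python
def contribution_value(value_per_unit, units, attribution):
    """Impact value, adjusted for the share attributable to the program.

    Impact = value per unit * number of units
    Contribution = impact * attribution percentage (as a fraction)
    """
    impact = value_per_unit * units
    return impact * attribution

# Hypothetical: $500 avoided per error, 200 errors avoided,
# 60% of the improvement attributed to the training program.
cost_avoidance = contribution_value(500, 200, 0.60)    # 60,000

# Hypothetical: $1,000 extra revenue per unit on 50 units, 40% attribution.
revenue = contribution_value(1_000, 50, 0.40)          # 20,000

total_attribution_value = cost_avoidance + revenue     # 80,000
```

The attribution percentage is the contested number here; the code merely records whatever the stakeholder conversation settles on.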
21. If monetary outcomes exist and can be attributed to the intervention, then do the math (it finally gets easy):
Design: total cost
Development: total cost
Delivery: total cost
Total Cost = Sum(Design, Development, Delivery)
Monetary Outcomes = Total Attribution Value
Net Benefit = Total Attribution Value − Total Cost
Benefit-Cost Ratio = Total Attribution Value / Total Cost
ROI = Net Benefit / Total Cost
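Given agreed-on figures, the math really is easy. A sketch with hypothetical costs and benefits:

```python
# Phase costs (hypothetical dollar figures).
design, development, delivery = 20_000, 30_000, 10_000
total_cost = design + development + delivery            # 60,000

# Total attribution value from the monetary-outcomes worksheet (hypothetical).
total_attribution_value = 90_000

net_benefit = total_attribution_value - total_cost      # benefits minus costs
bcr = total_attribution_value / total_cost              # benefit-cost ratio
roi_pct = net_benefit / total_cost * 100                # ROI as a percentage

print(f"Net benefit: ${net_benefit:,}  BCR: {bcr:.2f}  ROI: {roi_pct:.0f}%")
```

With these inputs the program returns $1.50 in benefits per dollar spent, for an ROI of 50%.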
23. Account for the risk in estimating future activities and outcomes.
Ask “what if…”. Revisit assumptions and, possibly, alter the shared mental model (challenging again).
Recompute the worksheet (Total Design, Total Development, Total Delivery, Total Cost, Total Monetary Outcomes, Net Benefit, Benefit-Cost Ratio, ROI), replacing the current values with revised values expressed as a percentage of the current values, under scenarios such as:
• What if personnel costs were 25% higher?
• What if monetary outcomes were reduced by 25%?
• What if personnel costs increase by 25% and monetary outcomes are reduced by 25%?
What-if scenarios
• Overall costs in the program’s life cycle are higher?
• Overall benefits are lower?
• Personnel costs are underestimated?
• Value of outcomes is overestimated?
• Both personnel costs are higher and outcomes are lower?
Because
• People infuse desires and emotions that inflate or deflate their beliefs about the numbers.
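A sensitivity analysis along these lines can be sketched as follows. The 25% adjustments mirror the slide’s scenarios; the baseline cost and benefit figures are hypothetical:

```python
def roi(total_cost, benefits):
    """ROI as a percentage: (benefits - cost) / cost * 100."""
    return (benefits - total_cost) / total_cost * 100

# Hypothetical baseline: $40k personnel, $20k other costs, $90k benefits.
personnel, other_costs, benefits = 40_000, 20_000, 90_000

scenarios = {
    "Baseline":             (personnel,        benefits),
    "Personnel costs +25%": (personnel * 1.25, benefits),
    "Benefits -25%":        (personnel,        benefits * 0.75),
    "Both":                 (personnel * 1.25, benefits * 0.75),
}

for name, (p, b) in scenarios.items():
    print(f"{name:22} ROI = {roi(p + other_costs, b):6.1f}%")
```

With these inputs the baseline ROI of 50% drops sharply under each scenario and turns negative when both adjustments hit at once, which is exactly the kind of result that forces a second conversation about the assumptions.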
24. The formula
U = (N)(T)(SD)(d) − C
where:
N = number of people trained
T = duration of the training effect
SD = standard deviation of the variation in job value
d = magnitude of the effect
C = cost of the program
Utility is (almost) the same as the ROI approach.
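The utility formula translates directly to code; the inputs below are hypothetical:

```python
def utility(n_people, years, sd_job_value, effect_size, program_cost):
    """Dollar utility of a training program: U = N * T * SD * d - C.

    n_people:     number of people trained (N)
    years:        duration of the training effect (T)
    sd_job_value: standard deviation of job value in dollars (SD)
    effect_size:  magnitude of the training effect in SD units (d)
    program_cost: total cost of the program (C)
    """
    return n_people * years * sd_job_value * effect_size - program_cost

# Hypothetical: 50 people trained, a 2-year effect, $10,000 SD of job value,
# effect size d = 0.4, and a $150,000 program cost.
u = utility(50, 2, 10_000, 0.4, 150_000)   # 400,000 - 150,000 = 250,000
```

Note the family resemblance to the ROI worksheet: SD × d plays the role of the value per unit, N × T plays the role of the number of units, and C is the total cost.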
25. The ROI calculation is (almost) the same as utility analysis.
Monetary Outcomes worksheet:
Business Outcome: More competent people
Description: The additional value of a trained person (their increased value)
Value of the business outcome per unit: $ (the increased value of one trained person)
Estimated units improved per year: number of people trained
Impact Value: $
Attribution Percentage: %
Contribution Value: $
Total Cost: $
ROI = (Net Benefits / Cost) × 100 = ((Benefits − Cost) / Cost) × 100
26. Case example: Using utility analysis to value quality improvement
This page Intentionally blank
27. Dynamic models: An experiential method to test assumptions
about planned investments
Allow stakeholders to experience (fly the simulator) and test changes
to assumptions and beliefs.
Dynamic models include:
• Non-linear effects.
• Feedback loops that accelerate growth or decline; the tipping point
effect.
• Lags in time (e.g., costs are immediate, benefits lag).
• Other important variables (e.g., attrition).
• Soft variables (e.g., satisfaction).
Sterman, J.D. (2000). Business
Dynamics: Systems Thinking and
Modeling for a Complex World.
Boston: Irwin McGraw-Hill.
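One of the dynamic-model ideas above — costs are immediate while benefits lag — can be illustrated with a minimal first-order lag. This is a toy sketch, not a full system-dynamics model; the monthly benefit, the six-month lag, and the upfront cost are all hypothetical:

```python
def cumulative_net_benefit(months, monthly_benefit=10_000, lag_months=6,
                           upfront_cost=60_000):
    """Simulate cumulative net benefit month by month.

    The benefit rate ramps toward its full monthly value with a first-order
    lag, while the program cost is paid entirely up front.
    """
    rate = 0.0                 # current monthly benefit rate
    net = -upfront_cost        # costs are immediate
    for _ in range(months):
        rate += (monthly_benefit - rate) / lag_months   # first-order lag
        net += rate                                     # benefits accrue late
    return net

# Early on the net position is deeply negative; it turns positive only
# after the lag has played out.
for m in (3, 12, 24):
    print(f"Month {m:2}: cumulative net benefit = {cumulative_net_benefit(m):>12,.0f}")
```

Letting stakeholders vary the lag and the monthly benefit and watch the break-even month move is a small example of "flying the simulator" to test beliefs about an investment.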
28. When? Timing and focus
[Diagram: a timeline showing that confidence in the data and findings increases over time — prior to implementation the focus is planning for impact and evaluating plans; during and following implementation the focus shifts to evaluating results and measuring business impact.]
29. Forecasting Impact & Value
• Basarab, D. (2011). Forecasting the value of training. Performance Improvement, 50(3), 22-27.
• McLinden, DJ (2010). Planning and training impact: Evaluating the future value of training investments. In P. Phillips (Ed.). ASTD
Handbook for Measuring and Evaluating Training. Alexandria, VA: American Society for Training and Development
• McLinden, D., Phillips, R., Hamlin, S. & Helbig, A. (2010). Evaluating the Future Value of Educational Interventions in a Healthcare
Setting. Performance Improvement Quarterly, 22(4), 1-11.
• Phillips, J.J. & Phillips, P.P. (2010). The consultant’s guide to results-driven business proposals: How to write proposals that forecast
impact and ROI. New York: McGraw-Hill.
• Kraiger, K., McLinden, D. & Casper, W. (2004). Collaborative planning for training impact. Invited article for a special issue on the Contributions of Psychological Research to Human Resource Management. Human Resource Management, 43(4), 337-351.
• Swanson, R. & Gradous, D. (1988). Forecasting Financial Benefits of Human Resource Development. Jossey-Bass.
Cost Effectiveness and Cost-Benefit Evaluation
• Boardman, A. E., Greenberg, D. H., Vining, A. R., & Weimer, D. L. (2006). Cost-benefit analysis. Upper Saddle River, NJ: Pearson
Prentice Hall.
• Kee, J.E. (2004). Cost-effectiveness and cost-benefit analysis. In J.S. Wholey, H.P. Hatry, & K.E. Newcomer (Eds.), Handbook of practical program evaluation. San Francisco, CA: John Wiley & Sons.
Evaluating the ROI for training
• Phillips, J. J. (2003). Return on investment in training and performance improvement programs. Boston: Butterworth-Heineman.
Utility Analysis and other approaches for evaluating financial impact.
• Cascio, W.F. (1989). Using utility analysis to assess training outcomes. In I.L. Goldstein (Ed.), Training and development in organizations (pp. 63-88).
• Cascio, W.F. & Boudreau, J.W. (2008). Investing in people: Financial impact of human resource initiatives. Upper Saddle River, NJ:
Pearson Education, Inc. Link to website also includes software tools.
Telling the story of future change and impact – building a theory
• Hodges, T. (2002) Linking Learning and Performance: A Practical Guide to Measuring Learning and Performance, Butterworth
Heinemann.
• Knowlton, L.W. & Phillips, C.C. (2009). The Logic Model Guidebook: Better Strategies for Great Results. Thousand Oaks, CA: Sage
Publications.
• Kraiger, K., McLinden, D., & Casper, W.J. (2004). Collaborative planning for training impact. Human Resource Management, 43(4), 337-351.
• McLaughlin, J.A. & Jordan, G.B. (2004). Using logic models. In J.S. Wholey, H.P. Hatry & K.E. Newcomer (Eds.), Handbook of practical program evaluation. San Francisco, CA: John Wiley & Sons.
* Contact me directly to obtain access to the Dropbox folder containing the forecasting tool.
Additional resources for more information
30. Forecasting: Estimating the future value of
training investments
Contact information:
Daniel McLinden, Ed.D.
Cincinnati Children's Hospital Medical Center
3333 Burnet Avenue
ML 3026
Cincinnati, OH 45229-3039
Office: 513 636 8933
Mobile: 513 739 9087
Office Email: daniel.mclinden@cchmc.org
Personal: dmc@dmclinden.com *
Skype: danmclinden
* Use this email to request access to dropbox.