Previous Questions and Answers
What data-based research provides
evidence that I can pass on to a client that shows that doing
a formal needs assessment up front will reduce training development
project time and costs, sharpen the focus of the training, and
lead to increased training success as measured by learner responses,
instructional outcomes, improved on-the-job performance, and
perhaps by contributing to organizational goals? I look forward
to your answer - as does the entire ID class at IIT.
Wonderful question, Phil, but
I don't have an answer for you that
isn't anecdotal. In my books and articles I point to many instances
where the client intended to do X, but through study determined
that Y, Z and Q were more appropriate. That not only resulted in
less training, it also focused the efforts on the right content.
Alas, it's a very difficult study to do in a controlled way.
But you knew that. My office mate and I were immediately provoked into
a conversation about how to do that study and eventually decided
that the better the controls and reliability, the less useful and
valid the findings would be.
Can I ask a question that is
not about your book? My question is 'What if you have a supervisor
who does not have time for her co-workers or the people she
is overseeing?' What steps would you take to resolve this?
Well, my book is most definitely
not about the answer to this question, although I imagine there
are many Jossey-Bass books that are. Here are my two cents, but take them with
several grains of salt,
since my expertise isn't in this area directly.
--- Talk to your supervisor
about this. Don't label him or her a failure, but ask for what
you need, such as more feedback on your X or more examples of
a good Y.
--- Look at the job description
associated with the supervisory slot. It might serve as a good
jumping off point for conversation. But don't hold it up and
point to it. That probably won't go over so great.
--- Cultivate a positive relationship
with him or her. People will be more likely to take suggestions
from those they perceive as favorably disposed to them.
--- Focus on the work and the
measures associated with your unit.
You and your supervisor can
work together on that, since you share those goals.
Take a look at the larger Jossey-Bass/Wiley
list. I bet there are
some texts that are associated with this issue.
How effective is PA in organizations
that lack a common vision or purpose?
It's difficult to do a performance
analysis in an organization without shared directions. Why?
First you've got problems as you attempt to define where you
are going and what it looks like, in broad strokes and details.
Who gets to define what optimal will look like? How much will
it shift with time and source?
Then, once you've forced some
decisions about shared directions, you get to the problem of
putting solutions in place. There's hard lifting required and
it's best to have the commitment early and deep, rather than
at the last moment, when you want somebody to do something.
What can be done when an organization
is already latching on to a solution before you are brought
in? Can a performance analysis help them step back and take
a fresh look?
I am wondering what type of interviewing
you do when
performing a needs analysis. Do you first come up with a list
of questions and ask all participants the same questions, or do
you have a general question that starts a discussion of the issue?
My questions are driven more by
the challenge I'm facing than by the fact that I'm doing an interview.
Good questions seek information about directions, current situation
and drivers. See the FTF web site (www.jbp.com/rossett.html) and
pages 59-61, 97, and 101-105 in the book, First Things Fast, for
many examples.
What makes the interview
special is the ability to follow up with related questions. For
example, if you ask about what the sponsor is hoping will happen
as a result of training, and she says,
"better teaming skills," then a natural follow up question
is, "What would those teaming skills look like, how would
they be acting if they had them?"
What is the best way for someone
who has been involved in developing interactive training products
for years to make a career transition to a performance improvement role?
While there is no one good way, there are
several paths open to you. First, read, read, read. I like
Dana and Jim Robinson's books on performance, what Judith Hale recently
wrote, the Mager classics, and, of course, my First Things Fast. In
addition, once you've read some of the basics, Stolovitch and Keeps'
Handbook is a must.
I like courses too. It helps to
have direction as you read and people to talk with about the new
ideas. Visit your local universities and see what they have
to offer. The class might be called Performance Technology or
Performance Improvement or Organizational Development or Systems.
I think that ISPI (the International
Society for Performance Improvement) is also a wonderful place
to start your transition. They have conferences and other related
events that will move you forward. Visit www.ispi.org.
I wish you good luck.
Are there software tools for needs assessments? I attended your Needs
Assessment class a few years ago, and I recall you talking about such a
software tool. I have been frustrated by web searches and literature searches
in trying to find software assessment tools. Can you help?
Yes, there are some tools available. Since you're sending your
question via the First Things Fast web site, you know we provide
some performance support here, particularly for planning your
approach to analysis. There are also many suggested questions at the
site and in the book.
Beyond our site, why don't you take a look at BNH Software in
Montreal? Visit bnhexpertsoft.com.
I really like Zoomerang. It's a nice, friendly online survey tool
that will allow you to capture data anonymously. Check out
www.elisten.com, which has some nice possibilities too.
Zoomerang and elisten are only useful if you know what questions you
want to ask. That's always the bigger challenge during analysis.
How does Performance Analysis differ from Needs Assessment?
That's an important question and I cover it in great detail in the
first few chapters of the book, First Things Fast. There I also talk
about why I think it a distinction worth making.
Let me summarize here, but do take a look at the lengthy coverage of
this topic in the book.
There are, of course, similarities between training needs assessment
and performance analysis. First, they both represent methods for
figuring out what to do, although at different levels of detail and
with varying proximity to the solution. They are efforts to
understand and serve customers, to figure out what they want,
what's currently happening and how to improve the situation. And they
are based on asking questions of sources (Rossett, 1999, 1987).
The difference is where they are in the food chain and how
much is known prior to commencing. Performance analysis is what you
do first and fast, as we take the pulse of the people and
organizations. Performance analysis assures that we find or build
the right thing(s) for customers. Training needs assessment is the
study that helps us actually generate the right products, services
and relationships. PA comes first and yields a plan. TNA then
follows, focusing on situations where education, training and
information are appropriate; a tangible intervention is based on the
interactions that occur during TNA.
The specific performance analysis questions for different types of
issues and other tools you offer are useful, but what
would be more useful for me is to develop the traits of a good
performance consultant, like the traits of critical
thinker. This way it becomes part of my everyday life. What are the
traits of a good performance analyst? My guesses are
empathy, humility, perseverance, and self-discipline. How does one
develop the traits of a good performance analyst? Will
just practicing the questioning skills lead to the traits?
Interesting question, Jon. I think both Dana Robinson and Harold
Stolovitch have done some writing on that very issue.
Let me comment on some of the traits I think most germane, beyond the
obvious (and critical) ones of communication and persistence.
Here are some traits I think MOST linked to effective performance
analysis: CURIOSITY, SKEPTICISM, PLANFULNESS, IRREVERENCE, CUSTOMER
FOCUS. Of course, far better to be conceptual and quick, rather than
plodding and yoked to rules. And then, to convert findings to
meaningful systems, some skills at writing and making a business case
and marketing your ideas would be critical.
I wish that merely saying the questions would translate them into reality.
Naaaa. I don't think so. Add the questions to the traits. That
would be good.
My career so far has focused on employee development, so I've
always had access to expert and novice learners.
My new, relatively small company produces software, and my learners are
those who buy the product. I have access to in-house
expert users but really need to observe novices. Apart from
interviewing third parties such as tech support and sales
staff, how do companies arrange to interview, survey or observe
novice users of their products?
You can think about it as if it were usability testing. Couldn't you
tag along as your engineers and programmers try out their efforts
with individuals who are 'virgin' to the product?
Where do you get such novices? Well, organizations do many things to
attract such users. They invite novices from other branches of the
organization. They advertise and pay them to participate or provide
free software for those willing to test drive it. Years ago, when we
wanted to test out a brochure, I sent graduate students out across
the campus to show it to random students, to observe and query them.
Yes, you are setting it up so it's not au naturel. But you are
getting that novice or virgin look at the materials which will tell
you much about where the training and information support needs to be.
Is it ever important to share learning objectives with top management or
should we only share the terminal objectives with this group?
If an executive expresses interest, sure. But few will. They are
interested in their organizational strategies, in the results that matter
for the organization. Even if their org is a school, what they will care
about is how well the kids are reading, how many books they are taking out
of the library, attendance. Learning objectives are enablers. Hook them
to meaningful performance and results.
Our Training Department generally accepts the need for performance
analysis when a line manager calls and says, 'Come train my staff on everything,'
or something similar. However, my peers just don't see the need for
performance analysis when it comes to rolling out training for an entirely new
skill or program that is ordered from the top of the organization. Can you
suggest some ways to convince my peers of the value of a PA in these instances?
Rather than attempting to convince them to let you do a performance
analysis, why don't you focus on the questions that need answering and on how
important it is to have many people/organizations' fingerprints all over it?
Start with the questions: what exactly is this new whatever they are
attempting to rollout? What problems does it solve? What opportunities
does it further or create? When will people get a chance to use it? What
is most important about it? How is it like and unlike the old or related
whatevers? What will they be doing and thinking when using it?
You'll also want to know where employees and their supervisors are on it.
Are they already somewhat knowledgeable? Are they eager? Confident?
Ready? Resistant? And their supervisors... where do they stand on it?
Now, maybe, the sponsor can answer all these questions convincingly,
without stumbling. Typically, she/he can't. And most will acknowledge
that the REAL sources need to give you their perspectives, both for the quality
of the responses and for the politics of engaging people in the programs that will follow.
Several chapters in First Things Fast address this topic. Last January I
published an article in Performance Improvement on this very topic. It's
called "Communicating with the People in the Organization Who Aren't Us."
I wish you well.
You have written several articles on the changes to instructional
design. Where can I access these on the web?
Well, Erika, hope you're not sorry you asked.
Few of my writings are pure ID articles. Most focus on analysis or
performance systems, not just ID.
Let me draw your attention to the Barnett and Rossett piece in Training
magazine, as well as the Training and Development article, with a live
link, called, "That was a great class, but..." There are others as well as you can see below.
Rossett, A., & Marshall, J. (1999). Signposts on the road to knowledge
management. In K. P. Kuchinke (Ed.), Proceedings of the 1999 AHRD
Conference: Vol. 1 (pp. 496-503). Baton Rouge, LA: Academy of Human
Resource Development.
Rossett, A. (February 1999). If they resist, then you insist. Inside
Technology Training, 3(2), 45-47.
Rossett, A. (January 1999). Understanding the people in the organization who
aren't us: Communication strategies for analysis. Performance Improvement
Journal, 38(1), 16-19.
Watson, J. W. and Rossett, A. (in press) Guiding the independent learner
in web-based instruction. Educational Technology.
Rossett, A. (August 1998). Viewpoint: No cheers for the corporate U.
Training, 35(8), 95-96.
Rossett, A., Keenan, C., & Adgate, G. (September 1997). Aztechnology Turns:
A World Wide Web soap opera about change in the profession. Performance
Improvement, 19(1), 34-40.
Rossett, A. (July 1997). That was a great class, but.... Training and
Development Journal, 51(7), 18-24.
Fulop, M., Loop-Bartick, K., & Rossett, A. (July 1997). Using the Internet
to conduct a needs assessment. Performance Improvement, 36(6), 22-27.
Rossett, A. (March 1997). Have we overcome obstacles to needs assessment?
Performance Improvement, 36(3), 30-35.
Marshall, J. & Rossett, A. (January 1997). The learning community: How
technology can forge links between home and school. The American School
Board Journal, 181(1), A20-A24.
Rossett, A. & Barnett, J. (December 1996). Designing under the influence:
instructional design for multimedia training. Training, 33(12), 33-43.
Rossett, A. (April 1996). Training and organizational development: siblings
separated at birth. Training, 33(4), 53-59.
Rossett, A., & Czech, C. (1996). They really wanna but... the aftermath of
professional preparation in performance technology. Performance Improvement
Quarterly, 8(4), 114-132.
Field data suggests a high positive correlation between fast analysis and
useful information. Q1. Does your experience support or challenge this
conclusion? Q2. How do we convince performance technologists not to feel guilty
when they discover important, useful information in a short period of time
using the FTF techniques?
First question first. I'm not certain that there is a high
correlation between data that is fast and data that is useful/valid, i.e.,
that speaks to the issues at hand. I'd like to be able to proclaim that
speed=quality, and I'm certain it happens from time to time, but I can't
promise that speedy data, the opposite of analysis-paralysis, leads to high
quality information. Speed likely correlates with permission to study
prior to action. That in itself is a step in the right direction.
Second question. In FTF I tried to convince performance professionals to
trust focused, swift study, fly bys, perhaps a handful of interviews and an
examination of randomly selected work products. I'm urging iterative
analysis, waves of analysis that are narrowed by contact with sources.
I labored to make the case for guided and quick efforts in comparison to
doing nothing except saluting and doing whatever the customer asked for.
In most cases we learn wondrous things from taking a fresh, irreverent and
yes speedy look that involves sources beyond the 'training' group or the
customer. Subsequent analyses, focused on the reengineering of processes
or new jobs or the adaptation of a class, for example, would provide more
depth and detail.
"I teach a graduate level introductory course in Instructional Design at a local university. As it is a part of the EdM and EdD programs in the school of education my classes are mixed between educators, K-12 and Higher Ed, and people who, like myself, are ID practitioners in business, industry, and government. Teaching front end analysis has always been difficult when dealing with educators, as many of them never face the sort of situations we IDs face.
There was an NSPI Journal article a few years back in which you and Roger Kaufman were interviewed by Hirumi and asked to apply your approaches to K-12 public education. I have my students read that, but there is still a problem for me getting through to the K-12 folks. They accept that it needs to be done, but at the same time state that they will never, ever have to do it in their jobs. I've wracked my brains trying to collect or come up with examples of where they might be performing such an analysis under a different label. Any thoughts?"
"We enjoy the same mixed bag of students in our graduate program. And the question does arise. Can teachers be expected to do analysis? Heck, can they be expected to do instructional design? Bob Reiser and Walt Dick at FSU, and others, have worked on this for many moons.
I don't think that most educators will be able to do full blown analyses. Let's be realistic here. What I do think is that they can study drivers and anticipate obstacles in ways that enable them to increase the relevance and transfer of their teaching efforts. No, they can't pick most of the
content/topics for their efforts. States and boards and tests do that. But they can look at the factors that impinge. They can use data to make a good case for systemic approaches.
Many of the educators I work with soon find themselves engaged in staff development and leadership. Others are soon working with parent groups. Then performance analysis becomes germane, and immediately so. How else to define the effort? Chapter 9 in First Things Fast has an example that is relevant to schools. It's the way educators can be strategic about their excitement regarding technology.
I wish you good luck."
"How do you measure the success of the performance analysis?"
"The performance analysis should do several things for you. Ask yourself,
once you've finished with it and used the results, if the following are
true or not:
- I have a much better fix on the situation.
- I've involved others, besides myself and the immediate customer, in
figuring out what to do.
- I am able to talk about this situation with the customer in a way that
provides him or her with something they didn't know, something tangible and useful.
- I know how we ought to handle this situation or this group of people.
- My customer would agree that he or she now has a better sense of what to do
and who needs to be at the table to move forward.
- I can now list the people and resources that we ought to engage to move
forward and can point to some data (work product, interviews, focus groups,
quotes, opinions...) to support my recommendations.
- If I had another day or two, I know what additional data I would gather.
- Key people in the organization now know what to do and can explain the
rationale for the system I'm recommending.
- The problems and opportunities defined and examined during the analysis
are reduced as a result of the solution system we put in place. (This
would be further on down the road, post implementation.)
Can you agree to the above statements? If the answer is yes, then I think
you're looking good on this analysis."
"How do I get customers and experts to pay attention to analysis?"
Manager, Global Educational Services,
"This is a common problem. Customers want what they want when they want it.
Rarely do they want to pause for reflection or to rethink a solution or to
gather data from others to enlighten the effort.
Still, some amount of examination is the right thing to do.
It's naive to assume that any one source, even the leader, has a
sufficiently robust view of the work, worker and workplace to know what to
do. During performance analysis we swiftly seek a fresh view so that we can
customize and tailor the solution.
I'd focus there. While customers are hesitant about analysis or study or
data gathering, they tend to be fond of customization. They want a
solution just for them. Focus there. Sell that aspect of the effort. Ask
questions that help you tilt the effort to reflect their circumstances.
Seek access to work products and data that help you see where the effort
should be directed. And promise and deliver speed."