Success Evaluations
Measuring our members' success.
Over the last 26 years we have helped thousands of people from all over the world to conquer their stutter. The majority of our members enjoy remarkable changes in their lives, but no programme or method works for everyone, so we are fully transparent with our annual statistics and our new members' success metrics.
Every year, as part of our ongoing quality control, each new student's Personal Primary Coach and Regional Director (RD) submit reports on their students directly to Dave McGuire. One or more Evaluation Auditors (EAs) then verify and audit the members who have been rated as successful by our coaches and Regional Directors. Since 2003 we have rated those who we could NOT contact as unsuccessful.
It is important to note that our evaluation criteria are strict and our process is rigorous. At least two members of our staff evaluate each new member 12 to 18 months after they join the Programme, and these evaluations consistently rate success lower than members' own self-assessments do. Each year we use this valuable information to make continual improvements to the Programme, ensuring that we offer our members the most comprehensive support possible.
The following shows our new members' total success rates over the past five years (2014 to 2018).
It should be noted that many other intensive programmes for those who stutter only measure success via self-assessments, and only factor in those who respond. Some other programmes measure success by assessing in the clinic and/or on the last day of an intensive course.
We consider these evaluation processes to be flawed and inaccurate for the following reasons:
- Self-assessment surveys give useful information, but do not give a reasonably accurate indication of success. In our experience, someone may report themselves as 'satisfied,' 'successful,' or 'improved' even though they are still using avoidance mechanisms and blocking regularly.
- For that reason, we have multiple people evaluate each new member, and we only count the assessment by an experienced evaluator/auditor in our official success rate.
- Basing a success rate only on survey responses does not give an accurate indication of success. For example, if a programme only receives a 50% survey response rate, the evaluator does not know how the 50% who did not respond are doing. In our experience, those who do not return calls for evaluation purposes have gone back to old avoidance behaviours, including being afraid to talk to an evaluator, and have returned to out-of-control stuttering.
- Therefore our evaluation base is the TOTAL number of members who joined our programme during the target year. Those not responding are recorded and counted as 'not-successful' unless there is a very good reason to evaluate them otherwise (the sketch at the end of this section shows how this choice of base affects the calculation).
- Evaluations done during or right after an intensive course are not a good indication of success. The clinic (or, with McGP, the conference room) quickly becomes a comfort zone, and there is a 'honeymoon' period immediately after a course ends, when fear is very low and confidence is high.
- Therefore, our evaluation audits are done anywhere from 9 to 18 months after a member's first course, which gives us an accurate indication of 'long-term' success in the real world. If we evaluated immediately after the first course, our success rates would be close to 100%.
- Any success evaluation should have some kind of 'check and balance' verification procedure to ensure accuracy. With very few exceptions, our Evaluation Auditors live outside the regions they are auditing, decreasing the chance of undue influence or bias from others in the region.
However, so that we can compare ourselves with other programmes for those who stutter, starting with the Evaluation Audits from 2014 we now also include statistics based on:
- Audited success rates excluding those whom the EAs were unable to contact (i.e. success as a percentage only of those who responded to our Auditors). This is similar to the evaluation procedures of other programmes.
- Self-assessed improvement in Quality of Life (QOL). This is based on the comments from Primary Coaches, Regional Directors, and/or Auditors and a positive answer to the question: “Do you see yourself as successful?”
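To make the difference between these two ways of reporting success concrete, here is a minimal sketch using purely hypothetical numbers (they are not our real figures). It shows how the same audit results produce a lower rate when the base is everyone who joined during the target year, and a higher rate when only those the Auditors could reach are counted.

```python
# Minimal sketch with HYPOTHETICAL numbers -- not real statistics --
# illustrating how the choice of evaluation base changes the reported rate.

total_joined = 100       # every member who joined during the target year
rated_successful = 55    # members an Evaluation Auditor rated as successful
not_contactable = 25     # members the Auditors could not reach

# Official rate: the base is everyone who joined, and anyone who could
# not be contacted is counted as not successful.
official_rate = rated_successful / total_joined          # 55 / 100 = 55%

# Comparison rate: only members the Auditors actually reached are counted,
# similar to programmes that rely on survey responses alone.
contacted = total_joined - not_contactable
responders_only_rate = rated_successful / contacted      # 55 / 75  = ~73%

print(f"Official rate (total base):  {official_rate:.0%}")
print(f"Responders-only rate:        {responders_only_rate:.0%}")
```

In this hypothetical example the same audit data looks roughly 18 percentage points better simply by leaving out the people who could not be reached, which is why we publish both figures.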