Measuring Our Members' Success

In the last 20+ years we have helped thousands of people from all over the world to enjoy MASSIVE success in combating their stuttering. The majority of our members enjoy remarkable changes in their lives, but no programme or method works for everyone, which is why we are fully transparent about our annual statistics and our new members' success metrics.

Twice a year, as part of our ongoing quality control, Primary Coaches and Regional Directors (RDs) submit reports on all their students to Dave McGuire.  Each year one or more Evaluation Auditors (EAs) verify and audit those members who have been rated successful by Coaches and RDs.  Since 2003 we have rated those whom we could NOT contact as unsuccessful.

It is important to note that our evaluation criteria are very strict and our process is very rigorous.  At least two members of our staff evaluate each new member 12-18 months after they join the Programme, and these evaluations are consistently lower than members' own self-assessments of their success.  Each year we use this valuable information to make continual improvements to the Programme, ensuring that we offer our members the most comprehensive support possible.

The following shows our new members' total success rates over the past four years:

From 2013-2016, more than 1,650 people joined the McGuire Programme to learn to overcome their stuttering and to speak more articulately and eloquently.

70% of all new McGuire Programme members from 2013-2016 were rated successful even 12-18 months after they joined the Programme.  Note that this figure counts all non-respondents as "unsuccessful," and it is what the McGuire Programme treats as its true overall success rate.

Unlike the McGuire Programme, many other programmes do not include non-respondents in their success rate calculations.  Using that method, 85% of new McGuire Programme members from 2013-2016 would have been rated successful.  84% of all new McGuire Programme members from 2013-2016 attributed an increased Quality of Life to the McGuire Programme and their improved speaking abilities.
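As a simple and purely hypothetical illustration of the difference between the two methods (these figures are NOT the Programme's actual counts): if 1,000 people joined in a given period, evaluators were able to contact 800 of them, and 680 of those contacted were rated successful, then counting non-respondents as unsuccessful gives 680 / 1,000 = 68%, whereas excluding non-respondents gives 680 / 800 = 85%.  The difference lies entirely in the denominator used.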

View our most recent Evaluation Audit, for 2016, here:  2016 Evaluation Results

Click here to view a description of our evaluation criteria and scale.

It should be noted that many other intensive programmes for those who stutter only measure success via self-assessments, and only factor in those who respond.  Some other programmes measure success by assessing in the clinic and/or on the last day of an intensive course.  We consider these evaluation processes to be flawed and inaccurate for the following reasons:

  • Self-assessment surveys give useful information, but do not give a reasonably accurate indication of success.  In our experience, someone may report themselves as 'satisfied,' 'successful,' or 'improved' even though they are still using avoidance mechanisms and blocking regularly.  For that reason, we have multiple people evaluate each new member, and we only count the assessment by an experienced evaluator/auditor in our official success rate.

  • Basing a success rate on only survey responses does not give an accurate indication of success.  For example, if a programme only receives a 50% survey response rate, the evaluator does not know how the 50% who did not respond are doing.  In our experience, those who do not return calls for evaluation purposes have gone back to old avoidance behaviours, including being afraid to talk to an evaluator, and have returned to out-of-control stuttering.  Therefore our evaluation base is the TOTAL number of members that joined our programme during the target year.  Those not responding are recorded and factored in as 'not successful' unless there is a very good reason to evaluate them otherwise.

  • Evaluations done during or right after an intensive course are not a good indication of success.  The clinic (or, with McGP, the conference room) quickly becomes a comfort zone, and there is a 'honeymoon' period immediately after a course ends, when fear is very low and confidence is high.  Therefore, our evaluation audits are done anywhere from 9 to 18 months after a member's first course, which gives us an accurate indication of 'long term' success in the real world.  If we evaluated immediately after their first courses, our success rates would be close to 100%.

  • Any success evaluation should have some kind of 'check and balance' verification procedure to ensure accuracy.  With very few exceptions, our Evaluation Auditors live outside the regions they are auditing, thus decreasing the chance of undue influence or bias from others in the region.

However, so that we can compare ourselves to other programmes for those who stutter, we have, starting with the 2014 Evaluation Audit, also included statistics based on:

  • Audited success rates excluding those whom EAs were unable to contact (i.e. success as a percentage only of those who responded to our Auditors).  This is similar to the evaluation procedures of other programmes.
  • Self-assessed improvement in Quality of Life (QOL).  This is based on the comments from Primary Coaches, Regional Directors, and/or Auditors, and a positive answer to the question: “Do you see yourself as successful?”

Past Years' Evaluation Details: