Quality Management

Annual Monitoring Report for the Academic Year 2009/2010

Introduction

The report’s purpose is to demonstrate the:

  • Very high quality of the programme
  • Extent to which APAC measures the learning experience of participants
  • Degree to which the courses produce safe and effective practitioners
  • Experience that future participants are likely to have
  • Transparency of APAC’s training activities so that prospective participants can make an informed choice about their training options
  • Evidence that investment of time and money by an individual or by an employer is very worthwhile


Introductory Explanation

This report is a summarized version of the formal 2009/10 annual review report submitted to: 

  • APAC’s collaborative academic partner Canterbury Christ Church University (CCCU)
  • Play Therapy United Kingdom (PTUK) – the organization providing a professional infrastructure for play therapy in the UK
  • Play Therapy International (PTI) – responsible for implementing the international standards of play therapy training and practice

How we measure the quality of our training:

APAC uses Play Therapy International (PTI)'s and PTUK's recommended standards for evaluating training. These require a four-level approach based on the Kirkpatrick model.

Reaction – how the trainee responds to the training content, methods and trainers – this is assessed through the use of ‘Happy?’ questionnaires normally at the end of the course.  APAC uses these for every theme, or block of training as well as at the end of the course.  They enable us to react quickly to any problems that might arise.  Although it is important that the facilitators build a good rapport with the trainees, this is only the start!

Learning – what have the participants learnt? Courses that are accredited by an educational institution such as a university will have formal methods of assessing how much has been learnt.  This may be by exam and/or by written assignments.  APAC uses the latter, fulfilling CCCU’s standards and criteria for marking at post graduate level, which are also determined by the QAA – the UK body responsible for the quality of higher education.  The internal marking of what is known as summative assignments is moderated by an External Examiner appointed by the University and also by an IBECPT audit.  The vast majority of training organisations stop at this level.

Behaviour – how has what has been learnt changed the individual’s behaviour?  APAC subscribes to the view that ‘what you can do’ is more important than ‘what you know’ in terms of play therapy practice, whilst acknowledging that the ‘doing’ has to have a sound basis of knowledge.  This is why we measure the application of knowledge through the changes in participants’ behaviour in their practice work, supporting activities and in their personal development.

Results – the measurable outcomes that have been achieved as a result of the learning and changes in the trainees’ behaviour.  APAC’s purpose is to train safe and effective play therapy practitioners – so we measure the clinical outcomes of their work.  We have adopted the standard psychometric instrument used to measure the mental health of British children (Goodman’s SDQ).

Most other play therapy training providers use only one level (reaction); a few, if accredited by a university, use two. Only APAC uses all four levels. This is particularly important for a practice oriented course.
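To summarise the four levels and the evaluation instruments described above, here is a minimal sketch (Python, purely illustrative; the wording is paraphrased from this report and the mapping is not part of any APAC system):

```python
# Illustrative summary of the four Kirkpatrick levels and the evaluation
# instruments described in this report (paraphrased; not an APAC system).
kirkpatrick_levels = {
    "Reaction":  "'Happy?' questionnaires after every theme and at the end of the course",
    "Learning":  "Written assignments marked to CCCU postgraduate criteria, externally moderated",
    "Behaviour": "Ratings of experiential work, practice activities and personal development",
    "Results":   "Clinical outcomes measured with Goodman's SDQ",
}

for level, instrument in kirkpatrick_levels.items():
    print(f"{level}: {instrument}")
```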

Basis of this report:

This report is based on the evaluations from and of 365 participants (from 23 cohorts: 14 Post Graduate Certificate and 9 Post Graduate Diploma courses) who finished their taught modules within the 2009/2010 academic year.  The statistical results are therefore reliable because of this large number of trainees, ours being the largest play therapy training programme in the world.

It is a sub-set of a larger report, submitted to CCCU and the IBECPT, which contains further detail, some of it relating to individual persons whose rights to confidentiality we are protecting.

In the main report we generally use the scales required by the University and IBECPT, ranging from 1 = ‘poor’ to 5 = ‘excellent’.  In this version we have converted these figures to more familiar percentages, eg a score of 4.25 is equivalent to 85%.
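If you would like to check the arithmetic yourself, the conversion is simply a multiplication by 20; a minimal sketch (Python, purely illustrative):

```python
def rating_to_percent(rating: float) -> float:
    """Convert a 1-5 rating to the percentage form used in this report (x20)."""
    return rating * 20.0

print(rating_to_percent(4.25))            # 85.0
print(round(rating_to_percent(4.58), 2))  # 91.6 - the overall course rating quoted below
```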

 

Summary of Findings

The main findings this year are:

  • Participant satisfaction has again increased year on year and has now reached a very high level. The overall rating of the course by the participants, using a scale of 1 – 5, is 4.58 – equivalent to 91.60%.  This has gradually increased over the last three years and has now exceeded our target of 4.5 (90%).  Contributing to this is an increase in the ratings of the individual Course Directors and Tutors from an average of 4.46 (89.2%) to 4.58 (93.4%) – for more details see reaction. 92% of participants would recommend the programme to others without qualification. 97% of the participants, as compared to last year’s 98%, had their expectations for each theme/module met or exceeded. In addition 98% of the participants had their aims met at each theme/module.
  • The average of individual subject ratings by the participants is 4.38 (87.6%) as compared to last year’s 4.39 (87.8%). 
  • Satisfaction with learning/teaching methods, at 4.58 (91.6%), has remained the same as last year.
  • Three new factors have been evaluated this year to further align APAC’s methods with Canterbury Christ Church University’s (CCCU): the availability of tutor support – 4.67 (93.4%); the quality of constructive feedback – 4.71 (94.2%); and the quality of support and supervision – 4.17 (83.4%)
  • The programme is having a positive effect in professional terms. 86% of the responses indicate a beneficial impact on their career. (See also changes to participants)

The quality of learning, as assessed by the marks of written assignments, reflects the stringent marking standards set by CCCU.  The high quality of the work has been confirmed by the External Examiner.  The quality of the experiential work has remained constant. (For more details see learning)

Written Assignments

  • This year’s overall average participant mark of written assignments is 67.56% - just short of 'distinction'.
  • There is a high degree of consistency between markers, as shown by the standard deviation of 2.31%.

Experiential Work

  • The overall average mark has remained constant at 3.49 (69.8%)
  • A lower percentage of participants – 2.34% as compared to 3.82% last year – required special support, including personal therapy, because their experiential work on some aspect was below the required standard for safe practice

Results in Practice.  This is a practice oriented programme, so the ultimate criterion of success is the safety and effectiveness of the participant’s clinical work.

The consistency of play therapy’s effectiveness, when the PTI standards of practice and training model as taught by APAC are used, is shown by the results of positive change for the last 5 years: 69%, 69.48%, 63.20%, 70.06%, 69.91%. These figures are based on all referrals – if the children assessed as ‘normal’ are taken out of these figures the positive change rises to 72%.  The proportion of children showing a positive change rises with the severity of the problems up to 84%.

For more details see results.

Looking at all the skills taught on the courses, 66% of the participants intend to apply them immediately in their work.  Only 1.7% felt that any of the subjects taught were not relevant to them.

There is a high level of employer satisfaction, as evidenced through the placement reports, which show a rating of 95.40%.

 

Profile of Participants

This profile is based upon the number of participants attending the cohorts included in this analysis.  It should give you a good idea of the make-up of future courses.  As you can see, there is a very broad age range - but very few men attend.

Gender

Gender    This Year %    2008/9 %    2007/8 %
Female    97.04%         95.38%      96.67%
Male      1.48%          1.26%       0.83%

As can be seen, this pattern has remained broadly constant over the years.

Age

Age        N This Year    % This Year    % 2008/9    % 2007/8    % 2006/7
20 - 29    73             22.74%         19%         19%         19%
30 - 39    110            34.27%         30%         32%         28%
40 - 49    83             25.86%         33%         23%         29%
50 - 59    50             15.58%         15%         18%         19%
60+        5              1.56%          1%          3%          3%

You can see that the programme attracts people of all ages - you'll fit in whatever your age!

This year the youngest participant was 22 and the eldest 66 at the time of registration.  The average (arithmetic mean) is 38.9 years, the same as last year; the median is 36 (half of all participants are below and half above this age) and the mode is 29 (the most frequently occurring age). 
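For readers who want to check how these three averages are computed, a minimal sketch (Python, purely illustrative; the ages shown are hypothetical, not the 2009/10 registration data):

```python
import statistics

# Hypothetical ages of a small intake, not the actual 2009/10 registration data.
ages = [22, 29, 29, 29, 36, 36, 41, 48, 55, 66]

print(statistics.mean(ages))    # arithmetic mean: sum of ages / number of participants
print(statistics.median(ages))  # median: half of participants below, half above
print(statistics.mode(ages))    # mode: the most frequently occurring age
```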

The chart shows the degree of consistency of the age of participants over the past four years.

Reaction of Participants

Quality is measured at two stages: after every theme (eg sand play, puppets, neuroscience) and at the end of each course.  All of the more detailed data are entered in our quality management system for analysis.  The amount of data from themes is very large (3423 forms, each with up to 10 data items).

Overall Satisfaction – How satisfied were the participants?

A crucial question is the participants’ overall satisfaction with the course.

Rating      N      %
1 (20%)     1      0.37%
2 (40%)     1      0.37%
3 (60%)     8      2.93%
4 (80%)     92     33.70%
5 (100%)    171    62.64%
Total       273    100.00%
Average            91.6%

 

The overall average of 91.6% is the highest that has ever been achieved and is above our target of 90%.

Satisfaction by Course

Overall figures can conceal problems with individual courses, and the more courses that are run, the more likely this is to happen.  APAC is certainly not perfect, so we show below the figures by course type for this year - and yes, we had a ‘Doomsday’ Certificate course: it was the first one delivered following a change of content after revalidation by the University, and we also had to change the Course Director mid-course.  Even so, it still received a reasonably good overall rating from the participants (77.8%). 

Average Rating %    Highest Course Rating %    Lowest Course Rating %
91.8                97.6                       77.8
92.4                98.2                       78.6

Course Content

‘What did you find most challenging?’ 

The courses are intensive, require a post graduate level of study and are challenging.  The responses to this question give a good indication of the areas that you will need to be aware of. We have introduced some new categories this year so that it is not always possible to make year on year comparisons.

 

 

 

Topic                                                              This Year %    2008/9 %    2007/8 %    2006/7 %
Personal process issues arising from the experiential exercises   30.57%         31.70%      31.07%      25%
Presenting to a group                                              12.08%         7.14%       -           -
Time management                                                    11.70%         14.73%      2.82%       6%
Course work - reading                                              8.68%          -           -           -
Theory                                                             7.17%          4.91%       1.13%       6%
Sandplay experiential work                                         4.15%          7.14%       9.60%       13%
Writing cases, essays etc                                          4.15%          4.91%       6.21%       2%
Personal/Family issues/Motivation                                  3.77%          0%          12.43%      6%
Administration/Documentation/Organisation                          1.13%          0.89%       7.34%       10%

88% of the challenges the participants face may be grouped into four categories:

Personal processing and confidence issues                 43%
External factors such as time management                  16%
Theory and Course Content                                 16%
Challenges in dealing with specific therapeutic media     13%
Total                                                     88%

The challenges presented by process issues and by dealing with specific creative arts media together account for 56% (55% last year), confirming that the experiential aspects of the course are working as intended.  The proportion of process issues seems to have stabilised.  No one therapeutic medium presents a significant level of difficulty.

Of the factors external to the delivery of the course, time management is the most important – having to juggle work, family life and course requirements.

Subject Ratings

Neuroscience theory and research methods were introduced in both the Certificate and Diploma last year.  This year we have made some minor modifications to the way in which these subjects are facilitated, mainly using PowerPoint.

The next table shows the level of consistency of subject ratings by participants, with scores expressed as percentages. You can also see the year by year improvement in most subjects.

The overall average of 4.56 (91.2%) is considered to be very satisfactory.

Participant Evaluations of Subjects after each theme

Subject                            Average Score %    Taught on
Embodiment, Projection & Role      97.60              PG Diploma
Court work                         96.60              PG Diploma
Masks                              96.60              PG Diploma
Clay                               95.20              PG Certificate
Movement                           93.80              PG Certificate
Play Therapy Dimensions Model      93.80              PG Diploma
Art/Drawing                        92.80              Both Certificate & Diploma
Sand                               92.80              Both Certificate & Diploma
Practice Management                92.40              Both Certificate & Diploma
An Introduction to Play Therapy    91.40              PG Certificate
Music                              90.60              PG Certificate
Therapeutic Story                  90.00              PG Certificate
Attachment                         89.40              PG Certificate
Puppets                            89.00              PG Certificate
Child Development                  88.60              PG Certificate
Ethics                             87.20              PG Certificate
Other                              84.60              -
Total                              91.20

 

This illustrates the range of subjects taught and the high level of satisfaction with each.

Learning Methods

This table is based on 1588 participant evaluations for this year.

Method                                  This Year %    2008/9 %    2007/8 %    2006/7 %
Practical / Experiential Exercises      98             96          94          88
Clinical Supervision                    86             86          82          80
Facilitation eg formal presentations    84             86          84          76
Case Studies                            84             84          82          74
Reading List                            82             76          76          72
Teaching of Theory                      80             82          76          76
Videos                                  72             74          62          60
Overall Average                         84             84          80          75

The data show the improvement over the years and the popularity of the experiential learning methods, which not only enable participants to understand the therapeutic processes that the children will experience but also show them how to apply what is taught, in a practical way, in the play room.

All methods have received a similar rating to last year, except the reading list, which has improved to a satisfactory level.  As expected with a practice oriented course built on experiential learning, and consistent with previous data, the practical and experiential methods score very highly.

Relevance of course content

Participants are also asked about the relevance of the block/theme content to their work and experience.  Out of 1639 responses only 11 (0.67%) produced a negative comment.

The Tutors

The next table is based on 3178 evaluations of the Course Directors and Tutors by the participants. 

Staff No    This Year %    2008/9 %    2007/8 %
11          99.0           96.8        96.2
45          97.0           -           -
31          96.4           95.4        97.4
35          95.8           94.0        -
48          95.8           -           -
29          94.2           90.0        90.6
5           92.2           91.4        88.8
37          91.8           97.2        -
20          91.2           95.6        89.4
36          89.2           93.8        -
28          89.2           94.2        94.4
Average     93.4           94.2        93.4

The overall quality of the faculty has again been rated very highly by the participants. The overall average has dropped very slightly from last year, back to the 2007/8 level.  Performance is consistent across the board.  The newer members of the faculty have very satisfactory ratings, partly due to the thoroughness of their training as trainers. 

It must however be remembered that a high rating in terms of participant acceptability does not necessarily correlate with teaching/learning effectiveness.

Support of the Participants by the Teaching Staff

This is the first year that we have included a question concerning the availability of tutor support for the participants.

Rating      N       %
2 (40%)     4       0.22%
3 (60%)     67      3.63%
4 (80%)     467     25.30%
5 (100%)    1308    70.86%
Total       1846    100.00%

The overall average of 4.67 (93.4%) is very high. 

This is also the first year that we have included a question concerning the quality of constructive feedback from the teaching staff to the participants about their work.

Rating      N       %
2 (40%)     3       0.17%
3 (60%)     49      2.71%
4 (80%)     427     23.58%
5 (100%)    1332    73.55%
Total       1811

The overall average of 4.71 (94.2%) is also very high. 

 

Quality of Learning

Participants are assessed in two main ways: 

  1. Written assignments
  2. Evaluation of experiential work

How good is the academic work?  Evaluation of learning in 2009/10 was carried out using a new method and standards of marking, based on the University’s requirements.  The table below is based on the marks of 966 assignments.

Although the overall course marks are assessed as ‘pass’ or ‘fail’, the University’s grading criteria provide an indication of the success of the learning by type of assignment (a minimal sketch of the banding follows the list):

  • Fail = 49% or less
  • Pass = 50% - 59%
  • Merit = 60% -  69%
  • Distinction = 70% and above
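
A minimal sketch of this banding (Python, purely illustrative; the thresholds are exactly those listed above):

```python
def grade_band(mark: float) -> str:
    """Map a percentage mark to the University's grading bands listed above."""
    if mark >= 70:
        return "Distinction"
    if mark >= 60:
        return "Merit"
    if mark >= 50:
        return "Pass"
    return "Fail"

print(grade_band(67.56))  # "Merit" - this year's overall average, just short of Distinction
```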

No student has been failed either by the University or the Professional Organisation (PTI) this year.

Average Marks by Type of Written Assignment

Assignment                           Average Total Marks (%)
Book review summary - Certificate    67.00
Book review summary - Diploma        65.92
Case study - Certificate             68.63
Case study - Diploma - Group         67.21
Case study - Diploma - Individual    70.79
Essay - Diploma                      69.14
Overall Portfolio - Certificate      70.00
Overall Portfolio - Diploma          67.15
Practice Management                  64.90
Process Diary - Certificate          64.00
Process Diary - Diploma              65.21
Project - Diploma                    70.79
Average – this year                  67.56

This year’s overall average of 67.56% (just below distinction grade) indicates a satisfactory performance by the participants and a tightening in the Internal Examiners’ standards, as requested by the University.

There is also consistency between assignments ranging from a low of 64.00 for the Certificate process diary to a high of 70.79 for the Diploma individual case study and also the Diploma project. 

How safe are the participants to practice therapeutic play and play therapy? - Rating of Experiential Work

The assessment of the participants’ experiential work, through the process diary and observation by the facilitating staff, is crucial.  If this aspect is not satisfactory it will not be possible to achieve a pass for a professional practice award, however good the written academic work.  Each participant is rated a number of times throughout the course on their work during the experiential exercises, on a scale of 1 – 5, where 2 or below needs attention.  Anyone receiving a score of 2 or below is notified and given additional help on how the problem(s) might be addressed.
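A minimal sketch of this flagging rule (Python, purely illustrative; the criterion names and scores are hypothetical):

```python
# Hypothetical 1-5 experiential-work ratings for one participant on one exercise.
ratings = {"Skills": 4, "Empathy": 2, "Openness": 5, "Self Awareness": 3}

# A score of 2 or below on any criterion triggers notification and additional help.
flagged = [criterion for criterion, score in ratings.items() if score <= 2]
if flagged:
    print("Needs attention on:", ", ".join(flagged))
```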

The summary table below is based on 1817 Course Director/tutor evaluations of the participants’ experiential work, by criterion.

 

             Skills %    Knowledge %    Empathy %    Authenticity %    Openness %    Self Awareness %    Group Awareness %    Total Rating %
This Year    67.8        67.6           70.4         72.2              71.2          71                  70.2                 69.8
2008/9       68.4        69             70.4         72.6              72.2          70.4                69                   70.2
2007/8       71.2        70.4           72.8         74.2              75.2          74.8                73.2                 73.2
2006/7       61.8        60.8           63.6         65.6              69.8          68.2                69.8                 65.6

Although the overall rating has fallen slightly this year from 70.2% to 69.8%, these results are considered to be satisfactory and consistent with previous years.

44 (2.34%) of the participants were rated 40% or below on at least one factor (ie potentially unsafe), requiring Course or Programme Director intervention. The percentage is down from last year’s 3.82%.

 

Changes to Participants

This (Kirkpatrick Stage 3) is the most difficult stage to measure quantitatively for two main reasons:

  1. It may be more difficult for some of our participants to apply their learning in their job environment than for others
  2. It is impossible to set a benchmark for each participant against which change can be measured

We examine the impact of the programme on participants’ jobs and also upon their careers.

Impact on their jobs

After each theme/block the participants are asked to comment upon how their learning experience of this skill will impact upon their job.  We have 3249 free text records relating to the courses included in this review.

Response to Training                           This Year    2008/9    2007/8
Will definitely try to apply the new skill     66.29%       47%       64.25%
Might try to apply the new skill               9.85%        20%       16.06%
Appreciated how the new skill might be used    19.70%       30%       18.65%
Reminded of how the skill might be used        2.46%        3%        0.52%
No new relevant skill learnt                   1.70%        0%        0.52%
Total                                          100%         100%      100%

The data show that new useful skills are being acquired by the participants at a satisfactory level.

Impact on their careers

In measuring changes in behaviour as a result of the training, participants are asked ‘What has been the main impact of the course on your career and self?’  The responses are provided in free text form.  These are then coded under the main categories listed in the table below.

Category                       Definition                                                                 This Year    2008/9    2007/8     2006/7
Career change                  Aided future change of career to a Play Therapist                         38.68%       30%       37%        37.33%
Career change                  Opened up more career opportunities                                        10.38%       12%       11.56%     12.44%
Job performance improvement    Large/huge positive influence on existing job - new methods of working     5.66%        17%       8.09%      10.44%
Job performance improvement    Positive influence on existing job - new methods of working                31.13%       31%       27.75%     26.44%
Personal development           Personal development                                                       14.15%       10%       15.61%     10.44%
Total                                                                                                     100.00%      100%      100.00%    97.09%

The data show that the programme is having a very positive effect in professional terms for the participants: 86% of the responses reported a beneficial impact on their career in some way.  The figures are similar to previous years’.  The overall result accords with the purpose of the programme, which is to produce safe and effective play therapy practitioners.

 

Results in Practice

It is our view that the most important criterion in evaluating the effectiveness of a practice based course at Certificate and Diploma level is the safety and effectiveness of the clinical work with the children, ie job performance.  APAC has a big advantage in its close links with PTI and its affiliated professional organisations such as PTUK, PTIrl, PTAu, PTHk, PTNz etc in the worldwide exchange of information – practice activities and results which inform training effectiveness and content.  We also measure results in terms of employers’ satisfaction by means of placement reports.

Clinical Outcomes

We have carried out a review, in conjunction with PTI, of the clinical outcomes data for the last eighteen months.  This includes input from both Certificate and Diploma level participants – some at the start of their course, some near the end.  The table below shows the number of clients by extent of change, as reported by referrers and parents.

Change in Total Difficulties – Combined Referrer and Parent

N = 1686 pairs (ie individual cases) of pre- and post-therapy SDQ (Strengths and Difficulties) Questionnaires relating to participants on this year’s courses.  The table shows the change experienced by the children, following therapy, as recorded by the referrers and parents.

 

            Negative          No Change         Positive           Total
            N      %          N      %          N       %          N
Referrers   212    22.77%     89     9.56%      630     67.67%     931
Parents     153    20.26%     69     9.14%      533     70.60%     755
Combined    365    21.65%     158    9.37%      1163    68.98%     1686

This year’s rounded average of 69% showing a positive change compares to an overall average of 70% for the previous five years – a slight drop. Negative changes are typically due to changes in the child’s environment whilst they are undertaking therapy, or to deeper issues being uncovered during therapy (in which case a further episode of therapy is normally prescribed). It is concluded that the participants’ job performance remains at a satisfactory level, and is within PTI’s guidelines.
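Purely for illustration, a minimal sketch of how the direction of change for a single case might be derived from its pre- and post-therapy SDQ Total Difficulties scores (Python; the report does not state the exact rule used, so a simple ‘lower score after therapy counts as positive’ comparison is assumed):

```python
def classify_change(pre_total: int, post_total: int) -> str:
    """Classify one case: a lower Total Difficulties score after therapy is positive."""
    if post_total < pre_total:
        return "Positive"
    if post_total > pre_total:
        return "Negative"
    return "No Change"

print(classify_change(pre_total=22, post_total=14))  # Positive
```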

Placement Reports

Another measure of job performance is obtained from the placement reports. This depends upon the co-operation of the placement organization, which is not always forthcoming. Employers/commissioners of the participants are asked to rate the participants’ activities in the placement under seven headings using a scale of 1 – 5, where 1 is poor and 5 is excellent.  (These have then been converted into percentages in the next table.)

             Attendance %    Punctuality %    Facilities Left in Good Order %    Communication With Staff %    Objectives Met %    How Well the Placement Worked %
This Year    97.2            98.4             97                                 93.8                          93.2                93.2
2008/9       98.2            98.2             98.6                               96.6                          91.4                96.6
2007/8       97.6            97               97.4                               94.4                          92.4                94.6
2006/7       96.4            96.6             95.6                               93                            93.2                95.4

Some of the ratings are marginally down on the previous years, but still show a high overall employer satisfaction rating of 95.4%.