Education in the Knowledge Society 22 (2021)

Teacher Training for Effective Teaching

Formación del profesorado para una labor docente eficaz

Francisco José Melara-Gutiérreza, Ignacio González-Lópezb

a Institute of Education, Health and Social Sciences; University of Chichester, United Kingdom

https://orcid.org/0000-0001-8347-8090
F.MelaraGutierrez@chi.ac.uk

b Departamento de Educación, Universidad de Córdoba, España.

https://orcid.org/0000-0002-9114-4370
ignacio.gonzalez@uco.es

ABSTRACT

This paper identifies the training needs of primary and secondary school teachers in relation to their daily work, so that this work may be understood as effective professional practice. To this end, a questionnaire was designed and completed by teachers from Spain, China and South Korea in order to determine the discrepancies between actual classroom practice and what that practice should be in order to achieve quality teaching. The training needs shared by the informants concern instructional objectives, the curriculum, expectations, autonomy, and formative and responsible evaluation.

Keywords:
Effective teaching
Teacher training
Needs analysis
Quality education

RESUMEN

Este trabajo identifica las necesidades formativas del profesorado de educación primaria y secundaria relacionadas con su labor diaria para ser entendida como una práctica profesional eficaz. Para ello, se ha construido un cuestionario que permite determinar la discrepancia existente entre lo que es la realidad de actuación en el aula y lo que debería ser para el logro de una enseñanza de calidad, cumplimentado por docentes de España, China y Corea del Sur. Los resultados revelan como demandas formativas comunes aquellas referidas a los objetivos de instrucción, el currículum, las expectativas, la autonomía, y la evaluación formativa y responsable.

Palabras clave:
Enseñanza eficaz
Formación del profesorado
Análisis de necesidades
Calidad educativa

1. Introduction

Effective teaching refers to the actions undertaken by teachers to achieve long-lasting holistic student development, which is greater than might have been expected given their previous academic performance and the social, economic and cultural situation found in their everyday surroundings (Murillo, 2005).

In the cross-national research carried out by Román (2010) on school effectiveness models in schools in Bolivia, Chile, Colombia, Cuba, Ecuador, Panama, Peru and Venezuela, the following factors were found to be defining characteristics:

1. A classroom environment based on good relationships between students and teachers.

2. A teaching methodology characterised by varied and fun activities, taking into account the diversity of the students and basing assessment on the students’ level of performance.

3. Time management in the classroom that maximises the effective time allocated to teaching and learning.

4. Planning of teaching focused on student learning outcomes.

5. Educational infrastructure and resources that support and motivate student learning.

6. Participation of families in their children’s school activities.

In their review of international studies on effective teaching, Ko, Sammons & Bakkum (2014) state that this term was created to refine the definition of much broader notions such as “good education” and “quality education”. In their conclusions, they define the work of teachers, so that it may be considered effective, in terms of student outcomes and classroom teaching behaviour. They suggest that effective teaching should be based on: having clear instructional objectives; possessing expert knowledge of the curriculum content and the strategies for teaching it; communicating and conveying to students what is expected of them and why; making expert use of existing instructional materials to spend more time on practices that enrich and clarify content; knowing students in depth, adapting instruction to their needs, and anticipating erroneous ideas in their prior knowledge; teaching students metacognitive strategies and giving them opportunities to master them; addressing higher and lower level cognitive objectives; assessing students' understanding, providing appropriate feedback on a regular basis; integrating instruction with that of other subjects; and accepting responsibility for student outcomes.

McLaughlin (2013) found that education systems around the world show that the most important factor in determining student success is the quality of the teaching staff: teachers must experience effective professional development throughout their careers, with opportunities to observe and work with colleagues, and lifelong learning must be regarded as adding value to their work.

In their recent work, Casillas, Cabezas & García-Peñalvo (2020) remind us that teachers play a major role in all educational reforms and innovations, since they are the ones in charge of adapting their classrooms to whichever elements they are offered. Training is therefore required to ensure that teachers develop, under any conditions, the characteristics of what is meant by effective teaching. Training plans must be based on the needs perceived by teachers, a need being understood as an unmet and essential condition that must be fulfilled for them to function normally and achieve their goal (Gairín, 1996).

Detecting these training needs, so that professional endeavours may be considered effective, is essential in order to steer the necessary processes of change and close the gap between the current and the desirable teaching situation. In this regard, Kaufman (1982, p. 73) defined needs assessment as a formal process for determining the gaps between present and desired outcomes, between where we are currently and where we should be.

Herreras (2007) explains that needs analysis should be the first stage in any study aimed at implementing programmes or services.

Huertas (2003) states that this type of analysis should first address a range of specific needs:

• Needs felt by the community and recognised by the agent of change.

• Needs felt by the community, but not recognised by the agent of change.

• Needs recognised by the agent of change, but not by the community.

• Needs observed by the agent of change, but absent in the community.

Thus, as noted by Kaufman and English (1979) and Huertas (2003), the following steps are required to implement a needs analysis:

1. Exploration (pre-assessment): this stage establishes the general purpose of the needs analysis and identifies areas of study, sources of information, the type of information to be collected, and the methods to be used.

2. Data collection (assessment): this stage establishes the logistics of data collection methods, and the survey administrators are trained.

3. Use (post-assessment): finally, priorities are established, and alternative solutions are determined. In addition, a plan is developed to implement the solutions, the needs analysis is evaluated, and the results are communicated.

The results presented in this paper are the outcomes of the initial stages of exploration and data collection, which were subsequently used to design training programmes according to the needs for change detected in each participating teaching group.

2. Methodology

The aim of this research is to develop a tool that identifies the educational needs perceived by teachers in relation to their daily work, in order to understand their teaching performance as effective professional practice. This paper addresses the concept of need based on the contributions of Kaufman (2006), who understands need as a gap between current results and desired results. Hence, collating these gaps will provide the necessary information and the order of priority to design a training plan to work on the specific areas required to achieve the stated teaching goal.
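As a purely illustrative aid (not the authors' procedure), the sketch below shows how this Kaufman-style notion of need can be operationalised: each need is the gap between the mean "should be" and "is" ratings for an item, and sorting the gaps yields an order of priority for the training plan. All item labels and ratings are hypothetical.

```python
# Minimal sketch (illustrative data only): quantifying needs as gaps between the
# desired ("should be") and current ("is") ratings and ranking them by size.
from statistics import mean

# Hypothetical per-item ratings on the 1-5 scale used in the questionnaire.
responses = {
    "clear instructional objectives": {"is": [3, 4, 3, 2], "should_be": [5, 4, 5, 4]},
    "communicates expectations":      {"is": [3, 2, 3, 3], "should_be": [5, 5, 4, 4]},
    "use of existing materials":      {"is": [4, 3, 4, 3], "should_be": [3, 4, 3, 4]},
}

# Need = mean("should be") - mean("is"); the largest gaps are the top priorities.
gaps = {item: mean(r["should_be"]) - mean(r["is"]) for item, r in responses.items()}
for rank, (item, gap) in enumerate(sorted(gaps.items(), key=lambda kv: -kv[1]), 1):
    print(f"{rank}. {item}: gap = {gap:+.2f}")
```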

The research design used is based on a non-experimental and descriptive methodology, which seeks to develop an in-depth and exhaustive understanding of a singular reality (Arnal, Del Rincón & Latorre, 1992). Specifically, the methodology uses a survey-based approach, responding to problems both in descriptive terms and in relation to variables, thus ensuring the rigour of the information collected (Galindo, 1998).

The target population is made up of primary and secondary teachers of English who attended a foreign language teaching methodology training course delivered in the summer of 2019 at the University of Chichester (United Kingdom). The training lasted from three to seven weeks (25 hours per week) and focused on curriculum design, teaching modes, evaluation systems, and motivation.

The study involved a total of 84 professionals from Spain (48.8%), South Korea (28.6%) and China (22.6%), countries that have signed inter-institutional agreements to receive the training described. This number of participants is valid for the intended purposes, as stated by McMillan & Schumacher (2006).

In order to respond to the proposed objectives, and given the difficulty of finding an existing instrument that would meet the expectations of the study, an ad hoc questionnaire was designed. It was administered during the first few days of the course as part of the needs analysis process that outlines the training elements the programme would work on with the teachers during their stay in the UK. The results obtained have been used to identify, on the one hand, the guarantees of reliability and validity provided by the instrument and, on the other, the educational needs of this group.

The instrument encompasses ten elements from the work developed by Ko, Sammons and Bakkum (2014), who identified, by means of a meta-analysis based on scientific evidence, the elements that define the effectiveness of teaching work in terms of student outcomes, teacher behaviours, and classroom processes. These have been transformed into observable behaviours, so that they are identified as expected actions or tasks at different levels of implementation.

To design the structure of the tool, the scale format presented in Kaufman (2006) was used, which allows data to be obtained to identify the gaps (needs) between the current perceived reality and the reality expected for the achievement of a given objective. The structure places the elements of analysis in a central space and subjects them to a double rating by the teachers along a five-point scale, in relation to how that reality is (“describe how you see yourself currently operating in your teaching role”) and how it should be (“describe how you think you should be operating in your teaching role”), where 1 means rarely, 2 occasionally, 3 at times, 4 often, and 5 consistently.

3. Results

Before validating the results obtained, and due to the small size of the participating sample (N=84), we sought to verify that the variables were normally distributed. As Table 1 shows, the values of the skewness (<3.00) and kurtosis (<8.00) coefficients indicate univariate normality in the data obtained (Thode, 2002). Likewise, the goodness-of-fit of the statistical model underlying the observations made and those considered desirable was established, assuming a discrete character in the scaled values, by means of the chi-squared test (α = .05) (Rao & Scott, 1981). The comparison proved to be significant for all the items proposed in each of the two subscales considered.
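By way of illustration only, the following sketch reproduces this kind of screening for a single item: skewness and excess kurtosis are computed against the thresholds cited above, and a chi-squared goodness-of-fit test is run on the five response categories. The simulated ratings and the uniform expected distribution are assumptions; the paper does not report the reference model it used.

```python
# Illustrative normality screening and goodness-of-fit check for one 1-5 item.
# Simulated data; equal expected category frequencies are an assumption.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=84)   # hypothetical answers from N=84 teachers

skew = stats.skew(ratings)
kurt = stats.kurtosis(ratings)          # excess kurtosis, as SPSS-style output reports
print(f"skewness={skew:.3f} (|value|<3 ok), kurtosis={kurt:.3f} (|value|<8 ok)")

observed = np.bincount(ratings, minlength=6)[1:]   # frequencies of categories 1..5
chi2, p = stats.chisquare(observed)                # default expectation: equal counts
print(f"chi2={chi2:.3f}, p={p:.3f}")
```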

| Item evaluated | Skewness: Coeff. | Skewness: St. Error | Kurtosis: Coeff. | Kurtosis: St. Error | Goodness-of-fit: χ² | Goodness-of-fit: p |
|---|---|---|---|---|---|---|
| How it is | | | | | | |
| 1. Has clear instructional objectives. | -0.527 | .263 | -0.370 | .520 | 23.619 | .000 |
| 2. Expertly understands the content of the curriculum and the strategies for teaching it. | -0.538 | .263 | 0.346 | .520 | 50.000 | .000 |
| 3. Communicates to students what is expected of them and why. | 0.073 | .263 | -0.571 | .520 | 29.571 | .000 |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | -0.470 | .263 | 0.563 | .520 | 60.286 | .000 |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | -0.539 | .263 | -0.499 | .520 | 26.238 | .000 |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | -0.057 | .263 | -0.401 | .520 | 41.952 | .000 |
| 7. Addresses higher and lower-level cognitive objectives. | -0.259 | .263 | -0.509 | .520 | 29.690 | .000 |
| 8. Evaluates student understanding, providing appropriate feedback on a regular basis. | -0.060 | .263 | -0.552 | .520 | 41.357 | .000 |
| 9. Integrates instruction with teaching of other subjects. | -0.073 | .263 | -0.663 | .520 | 28.381 | .000 |
| 10. Accepts responsibility for student outcomes. | -0.681 | .263 | -0.180 | .520 | 39.810 | .000 |
| How it should be | | | | | | |
| 1. Has clear instructional objectives. | -2.068 | .263 | 5.636 | .520 | 69.143 | .000 |
| 2. Expertly understands the content of the curriculum and the strategies for teaching it. | -3.356 | .263 | 13.531 | .520 | 54.000 | .000 |
| 3. Communicates to students what is expected of them and why. | -1.002 | .263 | 0.622 | .520 | 59.333 | .000 |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | -0.281 | .263 | -0.336 | .520 | 28.024 | .000 |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | -1.428 | .263 | 2.722 | .520 | 56.286 | .000 |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | -1.133 | .263 | 0.091 | .520 | 37.500 | .000 |
| 7. Addresses higher and lower-level cognitive objectives. | -1.484 | .263 | 1.878 | .520 | 74.095 | .000 |
| 8. Evaluates student understanding, providing appropriate feedback on a regular basis. | -1.276 | .263 | 0.629 | .520 | 47.786 | .000 |
| 9. Integrates instruction with teaching of other subjects. | -0.869 | .263 | 0.143 | .520 | 33.048 | .000 |
| 10. Accepts responsibility for student outcomes. | -1.266 | .263 | 1.640 | .520 | 75.167 | .000 |

Source: Authors’ own.

Table 1. Fit of the measurements obtained.

3.1. Reliability and validity of the measurement tool

The accuracy of the data obtained with the questionnaire and the stability of the measurement across different administrations are basic requirements that the instrument designed must fulfil (Lincoln & Guba, 1985). The information collected must therefore satisfy a number of conditions that guarantee its scientific veracity and do not compromise the study (Nunnally & Bernstein, 1994); that is, the measurements must be reliable and valid, consistent over time, and informative about the construct being measured.

Reliability is impaired when questions are not clearly formulated and lead to different interpretations by the persons surveyed; the internal consistency of the instrument must therefore be assessed in order to give significance to the items included in the test, in other words, to ensure that each one measures a portion of the trait or feature we wish to study (Cohen, Manion, & Morrison, 2011). Validity, for its part, depends on whether questions relate to facts or to opinions, attitudes and other dimensions that are not directly observable. It is assessed here by analysing the discriminatory capacity of the items so as to strengthen the one-dimensional nature of the test (Ebel & Frisbie, 1986).

The procedure used to determine reliability was Cronbach’s alpha, based on the average inter-item correlation. An initial evaluation of the results obtained (see Table 2) shows that the values corresponding to each of the scales described (alpha values above .6) indicate that, according to this criterion, the relationships between the different items included in the measurement tool are very high (Jisu, Delorme, & Reid, 2006). The total alpha value (.759), treating the two constituent scales as a single unit, indicates a high correlation and a high level of stability in the responses. The questionnaire therefore offers guarantees of reliability.
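As a minimal sketch (not the authors' code), Cronbach's alpha for a subscale can be computed directly from an N × k matrix of 1-5 ratings; the data below are simulated and the variable names are illustrative.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance of the total score).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: array of shape (n_respondents, n_items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
how_it_is = rng.integers(1, 6, size=(84, 10)).astype(float)  # simulated 10-item subscale
print(f"alpha = {cronbach_alpha(how_it_is):.3f}")
```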

| Scale | Alpha coefficient | N |
|---|---|---|
| How it is | .656 | 10 |
| How it should be | .630 | 10 |
| Total | .759 | 20 |

Source: Authors’ own.

Table 2. Alpha coefficients for the scales.

The behaviour of each of the instrument’s items (scalar items) reveals homogeneity indices with values greater than .15 and a positive sign for every item except number 4 on both scales (see Table 3); each of those items therefore measures a portion of the trait to be studied and the instrument is reliable (Henson, 2001), while the wording of item 4 should be revised so as not to compromise the reliability of the instrument. This is confirmed by the alpha coefficient obtained when item 4 is removed, which would improve the overall reliability of the test (>.759).
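A hedged sketch of the two diagnostics reported in Table 3 is given below: the corrected item-total correlation (homogeneity index) and Cronbach's alpha recomputed with each item removed. The data layout is assumed and the ratings are simulated.

```python
# Corrected item-total correlations and alpha-if-item-deleted for a k-item scale.
import numpy as np

def item_diagnostics(items: np.ndarray) -> None:
    n, k = items.shape
    total = items.sum(axis=1)
    for j in range(k):
        rest = total - items[:, j]                          # scale total without item j
        r_corrected = np.corrcoef(items[:, j], rest)[0, 1]  # corrected homogeneity index
        reduced = np.delete(items, j, axis=1)               # the k-1 remaining items
        alpha_wo = (k - 1) / (k - 2) * (
            1 - reduced.var(axis=0, ddof=1).sum() / reduced.sum(axis=1).var(ddof=1)
        )
        print(f"item {j + 1}: r = {r_corrected:.3f}, alpha if deleted = {alpha_wo:.3f}")

rng = np.random.default_rng(2)
item_diagnostics(rng.integers(1, 6, size=(84, 10)).astype(float))
```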

| Item evaluated | Corrected homogeneity coefficient | Alpha coefficient if the item is eliminated |
|---|---|---|
| How it is | | |
| 1. Has clear instructional objectives. | .198 | .752 |
| 2. Expertly understands the content of the curriculum and the strategies for teaching it. | .334 | .749 |
| 3. Communicates to students what is expected of them and why. | .155 | .753 |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | -.083 | .776 |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | .500 | .734 |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | .399 | .744 |
| 7. Addresses higher and lower-level cognitive objectives. | .524 | .732 |
| 8. Evaluates student understanding, providing appropriate feedback on a regular basis. | .419 | .742 |
| 9. Integrates instruction with teaching of other subjects. | .503 | .734 |
| 10. Accepts responsibility for student outcomes. | .395 | .744 |
| How it should be | | |
| 1. Has clear instructional objectives. | .243 | .755 |
| 2. Expertly understands the content of the curriculum and the strategies for teaching it. | .254 | .754 |
| 3. Communicates to students what is expected of them and why. | .263 | .753 |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | -.127 | .786 |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | .585 | .732 |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | .301 | .751 |
| 7. Addresses higher and lower-level cognitive objectives. | .318 | .750 |
| 8. Evaluates student understanding, providing appropriate feedback on a regular basis. | .470 | .744 |
| 9. Integrates instruction with teaching of other subjects. | .572 | .731 |
| 10. Accepts responsibility for student outcomes. | .419 | .743 |

Source: Authors’ own.

Table 3. Behaviour of model items.

Subsequently, the content validity of the instrument items was estimated by finding the discriminatory power of the items included in the scales. An item has discriminatory power if it is able to distinguish between participants who gain a high score on the test and those who gain a low score, that is, if it discriminates between those with a high and low level in the measured range (Ebel & Frisbie, 1986).

To carry out this study, the closed ordinal items (rated 1 to 5) of the two subscales were used to compute total scores, which were then recoded into three groups (Low, Medium, and High):

• 1 = Low group (minimum value, percentile 33): (“How it is” scale: 20, 33); (“How it should be” scale: 31, 42)

• 2 = Medium group (percentile 34, percentile 66): (“How it is” scale: 34, 36); (“How it should be” scale: 43, 45)

• 3 = High group (percentile 67, maximum value): (“How it is” scale: 37, 47); (“How it should be” scale: 46, 50)

Student’s t test for independent samples (α = .05) was then applied to groups 1 and 3 of the recoded grouping variable, establishing the difference between the groups that score low and high on the items (see Table 4). P-values below .05 indicate that an item has high discriminatory power; p-values equal to or greater than .05 indicate that it does not discriminate and that it should be reviewed (Morales, 2006). In this case, all items except number 4 in both subscales meet the objectives set for each of the questions, reflecting the existence of an internal structure in the questionnaire capable of responding to the demands raised.
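The sketch below reproduces this discrimination analysis under stated assumptions: subscale totals are split at the 33rd and 67th percentiles into low/medium/high groups, and an independent-samples t test compares the low and high groups item by item. The data are simulated, not the authors' responses.

```python
# Item discrimination: compare low- and high-scoring groups on each item.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
items = rng.integers(1, 6, size=(84, 10)).astype(float)   # one subscale, 1-5 ratings
totals = items.sum(axis=1)

p33, p67 = np.percentile(totals, [33, 67])
low, high = totals <= p33, totals >= p67                  # groups 1 and 3; the medium group is dropped

for j in range(items.shape[1]):
    t, p = stats.ttest_ind(items[low, j], items[high, j])
    verdict = "discriminates" if p < .05 else "review wording"
    print(f"item {j + 1}: t = {t:.3f}, p = {p:.3f} -> {verdict}")
```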

| Item evaluated | Low group mean | High group mean | t | p |
|---|---|---|---|---|
| How it is | | | | |
| 1. Has clear instructional objectives. | 2.97 | 3.88 | -2.947 | .005 |
| 2. Expertly understands the content of the curriculum and the strategies for teaching it. | 3.81 | 4.40 | -3.395 | .001 |
| 3. Communicates to students what is expected of them and why. | 2.58 | 3.40 | -3.182 | .002 |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | 3.42 | 3.40 | 0.082 | .935 |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | 2.84 | 4.36 | -6.006 | .000 |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | 2.87 | 4.00 | -5.041 | .000 |
| 7. Addresses higher and lower-level cognitive objectives. | 2.68 | 4.24 | -7.399 | .000 |
| 8. Evaluates student understanding, providing appropriate feedback on a regular basis. | 3.06 | 4.20 | -5.183 | .000 |
| 9. Integrates instruction with teaching of other subjects. | 2.32 | 3.80 | -6.036 | .000 |
| 10. Accepts responsibility for student outcomes. | 3.32 | 4.44 | -4.662 | .000 |
| How it should be | | | | |
| 1. Has clear instructional objectives. | 4.13 | 4.70 | -2.487 | .016 |
| 2. Expertly understands the content of the curriculum and the strategies for teaching it. | 4.50 | 5.00 | -2.884 | .006 |
| 3. Communicates to students what is expected of them and why. | 4.07 | 4.85 | -4.759 | .000 |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | 3.40 | 3.59 | -0.721 | .474 |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | 3.83 | 4.81 | -4.990 | .000 |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | 4.10 | 4.96 | -5.740 | .000 |
| 7. Addresses higher and lower-level cognitive objectives. | 4.03 | 5.00 | -6.205 | .000 |
| 8. Evaluates student understanding, providing appropriate feedback on a regular basis. | 4.23 | 5.00 | -5.862 | .000 |
| 9. Integrates instruction with teaching of other subjects. | 3.43 | 4.81 | -7.657 | .000 |
| 10. Accepts responsibility for student outcomes. | 3.70 | 4.70 | -4.556 | .000 |

Source: Authors’ own.

Table 4. Discriminatory power of the items in the questionnaire.

3.2. Results of applying the instrument

The data given in Table 5 indicate the needs perceived by the teachers surveyed for their teaching work to meet the demands of effective teaching.

To measure the magnitude of the effect of these differences (needs, in the sense of Kaufman (1982)), Cohen’s d index was used, which quantifies the effectiveness of an intervention (Coe & Merino, 2003). Cohen (1988) established that values below .2 are understood as trivial, that effects begin to be appreciable from .2 to .5, and that they are large from .8 onwards. In all cases, except for the item relating to the use of existing resources, the desired situation is rated higher than the current professional reality, with a very large effect size, which confirms the needs expressed above.
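The following sketch illustrates one way the paired comparison and effect size in Table 5 could be computed: a paired-samples t test between the "is" and "should be" ratings, with Cohen's d based on the pooled standard deviation. This d formulation is an assumption (the paper does not state which variant was used), and the ratings are simulated.

```python
# Paired comparison of "is" vs "should be" ratings with a Cohen's d effect size.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
is_now = rng.integers(1, 6, size=84).astype(float)                 # current practice (simulated)
should_be = np.clip(is_now + rng.integers(0, 3, size=84), 1, 5)    # desired practice (simulated)

t, p = stats.ttest_rel(is_now, should_be)
pooled_sd = np.sqrt((is_now.var(ddof=1) + should_be.var(ddof=1)) / 2)
d = (should_be.mean() - is_now.mean()) / pooled_sd                 # assumed d formulation
print(f"t = {t:.3f}, p = {p:.3f}, mean gap = {should_be.mean() - is_now.mean():.3f}, d = {d:.3f}")
```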

| Item evaluated | How it is: Mean | How it is: SD | How it should be: Mean | How it should be: SD | t | p | Difference of means | d | CI |
|---|---|---|---|---|---|---|---|---|---|
| 1. Has clear instructional objectives. | 3.39 | 1.141 | 4.45 | 0.827 | -6.891 | .000 | -1.060 | 1.064 | [0.741, 1.387] |
| 2. Expertly understands the content of the curriculum and the strategies for teaching it. | 4.00 | 0.744 | 4.75 | 0.656 | -6.926 | .000 | -0.750 | 1.069 | [0.746, 1.393] |
| 3. Communicates to students what is expected of them and why. | 2.96 | 1.011 | 4.42 | 0.698 | -10.833 | .000 | -1.452 | 1.681 | [1.329, 3.032] |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | 3.50 | 0.857 | 3.39 | 1.053 | 0.723 | .471 | 0.107 | 0.927 | [0.609, 1.245] |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | 3.55 | 1.124 | 4.39 | 0.792 | -5.635 | .000 | -0.845 | 0.864 | [0.548, 1.180] |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | 3.44 | 0.923 | 4.54 | 0.667 | -8.816 | .000 | -1.095 | 1.366 | [1.030, 1.702] |
| 7. Addresses higher and lower-level cognitive objectives. | 3.35 | 1.024 | 4.50 | 0.736 | -8.393 | .000 | -1.155 | 1.290 | [0.957, 1.622] |
| 8. Evaluates student understanding, providing appropriate feedback on a regular basis. | 3.61 | 0.957 | 4.61 | 0.602 | -8.107 | .000 | -1.000 | 1.191 | [0.863, 1.520] |
| 9. Integrates instruction with teaching of other subjects. | 3.04 | 1.023 | 4.15 | 0.871 | -7.632 | .000 | -1.119 | 1.168 | [0.841, 1.196] |
| 10. Accepts responsibility for student outcomes. | 3.81 | 0.988 | 4.29 | 0.872 | -3.312 | .001 | -0.476 | 0.515 | [0.208, 0.823] |

Note: CI=Confidence Interval (95%).
Source: Authors’ own.

Table 5. Mean and standard deviation of each item and difference of means between scales.

According to the participating teachers, quality teaching involves clearly determining and addressing both higher and lower order cognitive instructional objectives; expertly selecting curricular content and the best strategies for teaching it; communicating and justifying to students what is expected of them; tailoring instruction based on the individual needs of each student; providing opportunities to master metacognitive strategies; making use of formative evaluation on a regular basis; integrating instruction from different subjects; and attributing responsibility for the results obtained to the different elements involved in the educational activity. All this shows the demand for training in each of the proposed valuation items. In contrast, in the process of building quality education, the use of resources already designed for use in the classroom is not a priority according to the teachers surveyed.

It was also of interest to examine whether the participating teachers’ country of origin predicts their training needs; to this end, a one-way analysis of variance (ANOVA) (α = .05) was conducted (see Table 6).
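A minimal sketch of this step is shown below, assuming per-country vectors of item ratings; the group sizes follow the reported percentages of the N=84 sample, but the values themselves are simulated.

```python
# One-way ANOVA with country of origin as the single factor (simulated ratings).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
spain = rng.integers(1, 6, size=41).astype(float)        # approx. 48.8% of N=84
south_korea = rng.integers(1, 6, size=24).astype(float)  # approx. 28.6%
china = rng.integers(1, 6, size=19).astype(float)        # remaining participants

f, p = stats.f_oneway(spain, south_korea, china)
print(f"F = {f:.3f}, p = {p:.3f}")
```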

| Item evaluated | South Korea: Mean | South Korea: SD | Spain: Mean | Spain: SD | China: Mean | China: SD | F | p |
|---|---|---|---|---|---|---|---|---|
| How it is | | | | | | | | |
| 1. Has clear instructional objectives. | 3.46 | 1.351 | 3.44 | 1.074 | 3.21 | 1.032 | 0.310 | .734 |
| 2. Expertly understands the content of the curriculum and the strategies for teaching it. | 4.98 | .584 | 4.00 | .837 | 3.89 | .737 | 0.335 | .716 |
| 3. Communicates to students what is expected of them and why. | 3.13 | 1.154 | 2.76 | .943 | 3.21 | .918 | 1.767 | .177 |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | 3.63 | .875 | 3.22 | .822 | 3.95 | .705 | 5.594 | .005 |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | 3.29 | 1.160 | 4.10 | .735 | 2.68 | 1.157 | 14.862 | .000 |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | 3.08 | .830 | 3.71 | .929 | 3.32 | .885 | 3.945 | .023 |
| 7. Addresses higher and lower-level cognitive objectives. | 2.71 | 1.160 | 3.68 | .934 | 3.42 | .607 | 8.115 | .001 |
| 8. Evaluates student understanding, providing appropriate feedback on a regular basis. | 3.42 | 1.139 | 3.66 | .855 | 3.74 | .933 | 0.704 | .498 |
| 9. Integrates instruction with teaching of other subjects. | 2.79 | .977 | 3.27 | 1.049 | 2.84 | .958 | 2.139 | .124 |
| 10. Accepts responsibility for student outcomes. | 3.42 | 1.060 | 4.10 | .735 | 3.68 | 1.204 | 4.077 | .021 |
| How it should be | | | | | | | | |
| 1. Has clear instructional objectives. | 4.50 | .978 | 4.49 | .675 | 4.32 | .946 | 0.331 | .719 |
| 2. Expertly understands the content of the curriculum and the strategies for teaching it. | 4.67 | .868 | 4.83 | .495 | 4.68 | .671 | 0.582 | .561 |
| 3. Communicates to students what is expected of them and why. | 4.29 | .859 | 4.49 | .553 | 4.32 | .769 | 0.593 | .555 |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | 3.50 | .885 | 3.22 | .988 | 3.63 | 1.342 | 1.173 | .315 |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | 4.08 | 1.018 | 4.73 | .449 | 4.05 | .780 | 8.706 | .000 |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | 4.38 | .770 | 4.73 | .501 | 4.32 | .749 | 3.733 | .028 |
| 7. Addresses higher and lower-level cognitive objectives. | 4.46 | .721 | 4.68 | .521 | 4.16 | 1.015 | 3.561 | .033 |
| 8. Evaluates student understanding, providing appropriate feedback on a regular basis. | 4.54 | .588 | 4.68 | .567 | 4.53 | .697 | 0.633 | .533 |
| 9. Integrates instruction with teaching of other subjects. | 3.79 | 1.021 | 4.49 | .597 | 3.89 | .937 | 6.748 | .002 |
| 10. Accepts responsibility for student outcomes. | 4.25 | .737 | 4.41 | .805 | 4.05 | 1.129 | 1.151 | .322 |

Source: Authors’ own.

Table 6. Results of the Analysis of Variance according to country of origin.

There are five items in the “how it is” scale and four items in the “how it should be” scale in which the teachers’ country of origin predicts statistically significant differences in relation to their training needs. When measuring the reality of their teaching work, differences appear in the items related to the expert use of existing resources, attention to diversity and personalisation, the teaching of metacognitive strategies, addressing cognitive objectives at different levels, and responsible evaluation. When perceived success is measured, these differences occur within the indicators of attention to diversity and personalisation, the teaching of metacognitive strategies, addressing cognitive objectives at different levels, and interdisciplinarity, elements that must be emphasised in differential training by country, with the rest being common to all of them.

Application of Scheffé’s post-hoc multiple comparison test indicates which countries account for the differences observed (see Table 7).
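Since this step is a standard pairwise follow-up to the ANOVA, a hand-rolled Scheffé comparison is sketched below under the same assumptions as before (simulated per-country data, group sizes derived from the reported percentages); it is not the authors' code.

```python
# Scheffe pairwise comparisons after a one-way ANOVA (simulated data).
from itertools import combinations
import numpy as np
from scipy import stats

def scheffe_pairs(groups: dict, alpha: float = .05) -> None:
    k = len(groups)
    n_total = sum(len(g) for g in groups.values())
    # Within-group mean square from the one-way ANOVA decomposition.
    msw = sum(((g - g.mean()) ** 2).sum() for g in groups.values()) / (n_total - k)
    for (name_i, gi), (name_j, gj) in combinations(groups.items(), 2):
        diff = gi.mean() - gj.mean()                               # I-J
        f_contrast = diff ** 2 / (msw * (1 / len(gi) + 1 / len(gj)))
        p = stats.f.sf(f_contrast / (k - 1), k - 1, n_total - k)   # Scheffe p-value
        print(f"{name_i} vs {name_j}: I-J = {diff:+.3f}, p = {p:.3f}{' *' if p < alpha else ''}")

rng = np.random.default_rng(6)
scheffe_pairs({"Spain": rng.integers(1, 6, size=41).astype(float),
               "South Korea": rng.integers(1, 6, size=24).astype(float),
               "China": rng.integers(1, 6, size=19).astype(float)})
```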

| Item evaluated | I (group) | J (group) | I-J | p |
|---|---|---|---|---|
| How it is | | | | |
| 4. Makes expert use of existing instructional materials to spend more time on practices that enrich and clarify content. | Spain | China | .728 | .008 |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | Spain | South Korea | .806 | .008 |
| | Spain | China | 1.143 | .000 |
| 6. Teaches students metacognitive strategies and provides opportunities to master them. | Spain | South Korea | .624 | .029 |
| 7. Addresses higher and lower-level cognitive objectives. | Spain | South Korea | .975 | .001 |
| 10. Accepts responsibility for student outcomes. | Spain | South Korea | .681 | .026 |
| How it should be | | | | |
| 5. Has in-depth knowledge of students, adapting instruction to their needs and anticipating misconceptions in their prior knowledge. | Spain | South Korea | .648 | .004 |
| | Spain | China | .679 | .005 |
| 7. Addresses higher and lower-level cognitive objectives. | Spain | China | .525 | .035 |
| 9. Integrates instruction with teaching of other subjects. | Spain | South Korea | .696 | .006 |
| | Spain | China | .593 | .037 |

Source: Authors’ own.

Table 7. Scheffé's multiple comparison test.

When the reality of their teaching work is measured, it is the group of teachers from Spain who, compared with their counterparts in South Korea, significantly demand training related to personalised attention to the diversity of their students (where the difference is also significant with regard to the Chinese teachers), the promotion of autonomy through the teaching of metacognitive strategies, working with cognitive objectives at different levels, and responsible evaluation. Compared with the Chinese teachers, the Spanish group also considers the expert use of existing resources to be a priority. At the same time, the desired situation is generally rated higher by the Spanish teachers, who regard personalised attention, working with cognitive objectives at different levels and interdisciplinarity as elements requiring more training than do the Chinese and Korean teachers (the Korean group shows no differences from the other two countries in relation to working with cognitive objectives at different levels).

Table 8 lists, in order of priority, the training demands that teachers from the three participating countries perceive for their professional work to be identified with the principles of effective education.

| Training demands related to | Spain: Diff. of means | Spain: Priority | South Korea: Diff. of means | South Korea: Priority | China: Diff. of means | China: Priority |
|---|---|---|---|---|---|---|
| 1. Instructional objectives | 1.049 | 3rd | 1.042 | 5th | 1.106 | 3rd |
| 2. Curriculum content and teaching strategies | 0.829 | 6th | 0.583 | 9th | 0.789 | 6th |
| 3. Shared expectations | 1.732 | 1st | 1.167 | 3rd | 1.211 | 2nd |
| 4. Expert use of existing resources | 0.000 | 9th | 0.125 | 10th | 0.316 | 9th |
| 5. Personalised attention and diversity | 0.634 | 7th | 0.792 | 8th | 1.368 | 1st |
| 6. The development of autonomy in the learning process | 1.024 | 4th | 1.292 | 2nd | 1.000 | 5th |
| 7. Working with cognitive objectives at different levels | 1.000 | 5th | 1.750 | 1st | 0.737 | 7th |
| 8. Regular formative evaluation | 1.024 | 4th | 1.125 | 4th | 0.789 | 6th |
| 9. Interdisciplinarity | 1.220 | 2nd | 1.000 | 6th | 1.053 | 4th |
| 10. Responsible evaluation | 0.317 | 8th | 0.833 | 7th | 0.368 | 8th |

Source: Authors’ own.

Table 8. Training demands perceived by each country.

The demand considered to be the top priority by the Spanish teaching group is training in the communication of expectations. Although not at the same level of preference, it is also a major demand raised by the Chinese and Korean teachers surveyed.

In contrast, the Chinese teachers’ strong desire to improve personalised attention to diversity does not correspond to the assessment given by teachers from Spain and South Korea, who place this demand in the lowest positions in the table. This is also the case with the strongly held view among the Korean teachers that training is required to work with cognitive objectives at different levels; a demand that is not seen to be of great relevance among the other groups, particularly the Chinese teachers.

As the data provided throughout this study have shown, none of the groups of teachers surveyed considers training in the expert use of existing instructional resources to be relevant. This element requires a more detailed study, due to the relevance this action has in saving time that could be dedicated to practices that enrich and clarify content (Ko, Sammons, & Bakkum, 2014).

Although the demand for training in relation to the expert knowledge of curriculum content and the strategies for teaching it is not deemed to be of great relevance among the Korean teachers participating in the research, it is of equal importance to the groups from Spain and China.

Finally, it should be emphasised that while work on developing shared responsibility for school outcomes is far from being a formative need for all teachers involved in this research, demands for instructional objectives and the communication of expectations once again reveal a substantial difference of opinion between the Korean teachers and the groups composed of Spanish and Chinese teachers. Only in the need to improve the capacity to carry out regular formative evaluations do the opinions of the Korean and Spanish teachers align, and in disagreement with the Chinese teachers.

4. Discussion and conclusions

It has been possible to design, through empirical procedures, a tool that reveals the training demands of teachers with regard to the principles of effective teaching. Its experimental application in a group of teachers from three different educational settings has shown that, for their teaching to be understood as effective, the following common elements must be promoted, as being of the greatest relevance to all: working with instructional objectives, curriculum content, teaching strategies and shared expectations; development of autonomy in the learning process; and regular and responsible formative evaluation.

These results support the work carried out by Lizasoain and Angulo (2014) who, within the strictly instructional field, found that effective work must be grounded in a methodological approach that is in turn based on competencies, which is responsive to the diversity of students, where practices are reinforced through student support and monitoring, and where evaluation is formative in nature. It is important to mention that teaching performance, as Cordingley (2015) states, will be successful as long as it is based on lifelong learning systems.

Taking into consideration systems of continuing professional development for teachers will have a significant impact on the improvement of professional practice (Lizasoain, Bereziartua, & Bartau, 2016) and will, therefore, make it possible to improve the learning acquired by the students. In this regard, the training of teachers for effective teaching must be based on five basic pillars (Reoyo, Carbonero, Martín, Román, Flores, & Freitas, 2015): communication, interest in the subject (commitment), treatment of the students (relationship), competency (training) and organisation, evaluation and exposure (methodology).

5. References

Arnal, J., Rincón, D. del, & Latorre, A. (1992). Investigación educativa. Fundamentos y metodología. Labor Universitaria.

Casillas, S., Cabezas, M., & García-Peñalvo, F. J. (2020). Digital competence of early childhood education teachers: attitude, knowledge and use of ICT. European Journal of Teacher Education, 43(2), 210-223. https://doi.org/10.1080/02619768.2019.1681393

Coe, R., & Merino, C. (2003). Magnitud del efecto: una guía para investigadores y usuarios. Revista de Psicología, 21(1), 146-177. https://doi.org/10.18800/psico.200301.006

Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences. Lawrence Erlbaum Associates.

Cohen, L., Manion, L., & Morrison, K. (2011). Research methods in education. Routledge.

Cordingley, P. (2015). The contribution of research to teachers’ professional learning and development. Oxford Review of Education, 41(2), 234-252. https://doi.org/10.1080/03054985.2015.1020105

Ebel, R. L., & Frisbie, D. A. (1986). Essentials of Education Measurement. Prentice Hall.

Gairín, J. (1996). Las necesidades de formación. In J. Gairín, A. Ferrández, J. Tejada & A. Navío (Eds.), Formación para el empleo (pp. 71-116). CIFO.

Galindo, L. J. (1998). Técnicas de investigación en sociedad, cultura y comunicación. Pearson Educación.

Henson, R. K. (2001). Understanding internal consistency reliability estimates: A conceptual primer on coefficient alpha. Measurement and Evaluation in Counseling and Development, 34(3), 177-189. https://doi.org/10.1080/07481756.2002.12069034

Herreras, E. (2007). Análisis de Necesidades en el Proceso de Diseño de un Programa de Orientación. Revista Electrónica de Educación y Psicología, 3(5), 1-33.

Huertas, J. M. (2003). Metodología del Estudio de Necesidades. Universidad de Puerto Rico.

Jisu, H., Delorme, D. E., & Reid, L. N. (2006). Perceived third-person effects and consumer attitudes on preventing and banning DTC advertising. The Journal of Consumer Affairs, 40(1), 90-166. https://doi.org/10.1111/j.1745-6606.2006.00047.x

Kaufman, R. (1982). Identifying and solving problems: A system approach. University Associates, Inc.

Kaufman, R. (2006). Change, choices and consequences: A guide to mega thinking and planning. HRD Press.

Kaufman, R., & English, F. (1979). Needs assessment: Concept and application. Educational Technology Publications.

Ko, J., Sammons, P., & Bakkum, L. (2014). Effective teaching. Education.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic Inquiry. Sage.

Lizasoain, L., Bereziartua, J., & Bartau, I. (2016). La formación permanente del profesorado en centros educativos de alta eficacia. Bordón, 68(2), 199-218. https://doi.org/10.13042/Bordon.2016.68213

Lizasoain, L., & Angulo, A. (2014). Buenas prácticas de escuelas eficaces del País Vasco. Metodología y primeros resultados. Participación Educativa, 3(4), 17-28.

McLaughlin, C. (2013). Teachers learning: professional development and education. Cambridge University Press.

McMillan, J. H., & Schumacher, S. (2006). Investigación Educativa. Una Introducción Conceptual. Pearson Educación.

Morales, P. (2006). Medición de actitudes en psicología y educación. Construcción de escalas y problemas metodológicos. Universidad Pontificia Comillas.

Murillo, F. J. (2005). La investigación sobre eficacia escolar. Octaedro.

Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory. McGraw-Hill.

Rao, J. N. K., & Scott, A. J. (1981). The Analysis of Categorical Data from Complex Sample Surveys: Chi-Squared Tests for Goodness of Fit and Independence in Two-Way Tables. Journal of the American Statistical Association, 76, 221-230. https://doi.org/10.1080/01621459.1981.10477633

Reoyo, N., Carbonero, M. A., Martín, L. J., Román, J. M., Flores, M. V., & Freitas, A. (2015). Implicaciones para la formación docente: creencias de eficacia desde el profesorado de secundaria. International Journal of Developmental and Educational Psychology, 1(5), 427-432. https://doi.org/10.17060/ijodaep.2014.n1.v5.702

Román, M. (2010). Investigación latinoamericana sobre enseñanza eficaz. Revista Educación y Ciudad, (19), 81-96.

Thode, H. C. (2002). Testing for normality. Marcel Dekker. https://doi.org/10.1201/9780203910894