PLOS ONE. 2020 May 29;15(5):e0233613. doi: 10.1371/journal.pone.0233613

Communicative competence assessment for learning: The effect of the application of a model on teachers in Spain

Juan Jesús Torres-Gordillo 1,*, Fernando Guzmán-Simón 2, Beatriz García-Ortiz 1
Editor: Vasileios Stavropoulos
PMCID: PMC7259689  PMID: 32469950

Abstract

The evolution of the results of the Progress in International Reading Literacy Study in 2006, 2011 and 2016, as well as the difficulties found by teachers implementing the core competences, have led to the need to reflect on new assessment models. The objective of our research was to design a communicative competence assessment model and verify its effect on primary education teachers. The method applied was a focus group study. Participants came from four primary education schools in the province of Seville (Spain). The data were gathered through discussion groups. The COREQ checklist was followed. Qualitative thematic analysis of the data was carried out using Atlas-ti. An inductive coding scheme was established. The results have enabled the construction of a communicative competence assessment model and its application in primary education classrooms with HERACLES. The effects of the assessment model and the computer software were different according to teachers' profiles. On the one hand, teachers open to educational innovation remained positive when facing the systematic and thorough assessment model. On the other hand, teachers less receptive to changes considered the model to be complex and difficult to apply in the classroom. In conclusion, HERACLES had a beneficial effect on communicative competence assessment throughout the curriculum and made teachers aware of the different dimensions of communicative competence (speaking, listening, reading and writing) and discourse levels (genre, macrostructure and microstructure).

1 Introduction

Assessments carried out by the International Association for the Evaluation of Educational Achievement (IEA) in Spain have provided new evidence for the effects of the educational improvement measures applied in primary education in the last two decades. In particular, the assessments performed in the Progress in International Reading Literacy Study (PIRLS) in 2006, 2011 and 2016 have shown that competence in communication in Spanish primary education has not progressed at the same rate as in other European countries [1–3].

The Spanish government and different regional authorities have implemented diverse improvement plans, focused on modifications of the official curriculum and on educational legislation, to reverse this situation [4–6]. However, their results have not met expectations in the area of communicative competence. Today, we are familiar with numerous definitions of communicative competence [7–13]. The publication in 2001 of the Common European Framework of Reference for Languages [14] has enabled us to describe the skills required for communication and their levels of achievement related to reading, writing, listening and speaking.

Moreover, the development of communicative competence in the educational curriculum must be related to ‘accountability’ within teaching programmes, which leads us to delve deeper into the link between the school curriculum, based on key competences, and their assessment. Training in the assessment of competences in general, and of communicative competence in particular, presents numerous deficiencies in the initial and continuing training of primary education teachers. Similarly, the difficulty of adapting the theoretical concept of communicative competence to classroom assessment has led numerous authors to analyse the need to incorporate linguistic, cultural and social elements into the current educational context [15, 16]. Consequently, our paper focuses on the design and evaluation of a model for the assessment of communicative competence based on the Spanish curriculum through the use of a custom-designed computer application.

1.1 Assessment for learning

The design of an assessment model of communicative competence in the school context requires prior reflection on, first, the assessment model and, second, the assessment of communicative competence. Our research started with a reflection on which assessment model for learning was the most appropriate for incorporating communicative competence assessment in the primary education classroom. Assessment for learning is considered an assessment that fosters students’ learning [17–19]. Wiliam and Thompson [20] have developed five key strategies that enable this process to become an educational assessment:

  1. Clarify and share the learning intentions and criteria for success.

  2. Conduct effective classroom discussions and other learning tasks that provide evidence of students’ comprehension.

  3. Provide feedback, which allows students to progress.

  4. Activate the students themselves as didactic resources for their peers.

  5. Foster the students as masters of their own learning.

The assessment that truly supports learning has two characteristics [21]: the feedback generated must provide information about learning activities for the improvement of performance, and the student must participate in actions for the improvement of learning based on heteroassessment, peer assessment and self-assessment.

Assessment for learning must set out by gathering information that teachers and learners can use for feedback; that is, the result of the assessment must be information that both the teacher and the student can interpret for the improvement of the task. Wiliam [21] proposes an assessment that is incorporated into classroom programming and whose information is relevant for the improvement of the teaching-learning process. Decision making for the improvement of the task must be based on the information that the assessment indicators contribute to the learning process. In conclusion, the effort that the school makes to emphasise the learning assessment is justified for the following reasons:

  1. Assessment must not be limited to marking (summative assessment); rather, it has to do with helping students learn [22].

  2. Assessment is a key element in effective teaching, as it measures the results of learning addressed in the teaching-learning process [21].

  3. Feedback plays a fundamental role and requires the information received to be used by students to improve their learning [23].

  4. Instead of being content with solving obstacles in students’ learning, teachers must offer opportunities from the assessment to develop learning strategies [24].

1.2 Communicative competence in the European educational framework

The theoretical construct on which our research is based has different sources. Since the 1960s, communicative competence has been approached in different ways [25], from Chomsky’s cognitive focus [26], followed by Hymes’ social approximation [11, 27], to Wiemann’s approximation of relational competence [28] and the language development approaches of Bloom and Lahey [29] and Bryan [30]. Communicative competence in our research is founded on the works of Bachman [7], Canale [31] and Hymes [11] and on the Common European Framework of Reference for Languages: Learning, Teaching, Assessment (CEFR) [14].

The first allusion to the concept of ‘communicative competence’ came from Hymes [11]. He defined it as competence that uses the specific knowledge of a language’s structure; usually, there is not an awareness of having such knowledge, nor does one spontaneously know how it was acquired. However, the development of communication requires the presence of communicative competence between speakers [25].

Consequently, communicative competence not only is linked with formal aspects imposed by the structure of the language itself (grammatical) but also acquires meaning according to the sociocultural context in which it is developed. The incorporation of these sociocultural communication elements became the pillar of the models developed by Canale [31] and Bachman [7], which constitute the framework that the CEFR has adopted. In turn, the educational legislation in Spain has also carried out its particular adaptation to the national and regional context with the state regulation [5] and the regional law [32]. The particularity of the adaptation of communicative competence in primary education to national and regional educational legislation has brought about a certain confusion in the Spanish educational panorama. Nevertheless, the diverse conceptualisations of the communicative competence theoretical construct, found in the contributions of Canale [31] and Bachman [7] and in the different Spanish legislations (national and regional), maintain the same basic scheme of communicative competences.

Table 1 shows the correspondences between the different competences of the theoretical proposals of Canale [31], Bachman [7] and the state [33] and regional legislations [34] in Spain. A careful reading of this table highlights how the concept of communicative competence is not affected by the diverse terms used for its designation. Different authors and the legal texts propose the same parameters but present a different degree of specification and depth. The fundamental differences between the theoretical constructs of Canale [31] and Bachman [7] and the state [33] and regional legislations [34] are based on the creation of new competences, such as ‘personal competence’ (made up of three dimensions—attitude, motivation and individual differences—regarding communicative competence) in the first and ‘literary competence’ (referring to the reading area, the capacity of enjoying literary texts, etc.) in the second.

Table 1. Comparative table of the communicative competence components.

Canale [31] | Bachman [7] and CEFR | Ordinance 21/01/2015 [33] | BOJA 17/03/2015 [34]
Grammatical (or linguistic) | Organisational-grammatical competence | Linguistic | Linguistic or grammatical
Discursive (or pragmatic) | Organisational-textual competence | Pragmatic-discursive | Textual or discursive
Sociolinguistic (or sociocultural) | Pragmatic-sociolinguistic competence | Sociocultural | Sociocultural and sociolinguistic
Strategic | Strategic competence | Strategic | Semilogical
— | Pragmatic competence-illocutionary competence | — | Strategical or pragmatic
— | — | Personal | —
— | — | — | Literary

1.3 Communicative competence assessment

Communicative competence assessment must be considered within the process of the communicative teaching-learning of the language (‘communicative language teaching’ or CLT). The axis of this teaching model is ‘communicative competence’ [35]. This perspective aligns with Halliday’s systemic functional linguistics [36] and its definitions of the contexts of culture and situation [37]. Savignon’s CLT model [38] expands the previous research of Canale and Swain [8] and Canale [31] and adapts communicative competence to a school model (or framework of a competential curriculum). This model develops communicative competence regarding the ‘context’ and stresses communication’s functional character and its dependence on the context in which it is developed. The communicative competence learning process in primary education is related to the implementation of programmes that foster the participation of students in a specific communicative context and the adjustment of the distinct competences to the social context of the classroom where the learning is performed.

Communicative competence assessment in our research expands upon Lave and Wenger’s notion of ‘community of practice’ [39], the ‘theories of genre’, which underline the use of language in a specific social context [36, 40], and the ‘theory of the socialisation of language’ [41, 42]. These notions are integrated into the acts of communication [11, 38, 43] and give rise to diverse communicative competences, which are disaggregated to be assessed.

The changes introduced into the curriculum (with the inclusion of key competences) and in the theories of learning (with the cognitive and constructivist conceptions) have forced the rethinking of assessment [44]. From this perspective, a new evaluation of communicative competence has been constructed from the improvement of the learning processes, not through certain technical measurement requirements [45]. Assessment based on competences or as an investigation has become an excellent model for solving the problem of communicative competence assessment.

Moreover, the modalities of heteroassessment and self-assessment [25] enhance the impact of assessment on children’s cognitive development. Basically, there are three factors that influence communicative competence assessment: (a) the culture and context of observation (the culture of the observers is different and makes use of distinct criteria), (b) standards (they cannot be applied to all the individuals of the same community) and (c) conflicts of observation (the valuations of the observations can apply the assessment criteria with a different measurement). Furthermore, Canale and Swain [8] previously underlined the differences between the assessment of the metadiscursive knowledge of competence and the capacity to demonstrate correct use in a real communicative situation. In their reflections, they proposed the need to develop new assessment formats and criteria, which must be centred on communicative skills and their relation between verbal and non-verbal elements.

The perspective adopted in this article sets out from a communicative competence assessment grounded in Halliday’s systemic functional linguistics [36] and its adaptation in the genre pedagogy of the Sydney School developed by Rose and Martin [46]. The Sydney School’s proposal has as its starting point the development of an awareness of genre in the speaker or writer [47]. Similarly, the discourse’s adaptation to the social context at which it is aimed (situation and cultural contexts) has to be taken into account.

In summary, communicative competence assessment sets out from the tools supplied by discourse analysis [48], taking up elements of diverse discursive traditions, such as pragmatics, conversational analysis and the grammar of discourse (for more information, see [49, 50]). These tools respond to the levels of genre, register and language (textual macrostructure and microstructure) [51, 52].

1.4 Aims

Setting out from these suppositions, this paper addresses the following aims:

  1. To design a communicative competence assessment model based on the Spanish primary education curriculum.

  2. To check the effect of the communicative competence assessment model on primary education teachers using a computer application.

2 Method

The research design is based on the focus group technique applied to the study of the same reality through four groups. Each group represents a school with different characteristics and profiles (see Table 2), enabling a multi-perspective approach in which the schools contribute different opinions and experiences. The COREQ checklist was followed. All the participants were informed of the nature and aim of the research, thus conforming to the rules of informed consent, and signed written consent forms [dx.doi.org/10.17504/protocols.io.bd8ei9te]. In addition, this research was approved by, and adhered to the Social Sciences standards of, the Ethical Committee of Experimentation of the University of Seville.

Table 2. Students taking part in the research.

Course | Students (experimental group) | Students (control group) | Teachers
2nd | 127 | 91 | 6
4th | 115 | 96 | 6
6th | 126 | 98 | 8
Totals | 368 | 285 | 20

2.1 Participants

Twenty teachers from the second, fourth and sixth years, belonging to four primary education centres in the province of Seville, took part in this study. Prior to giving consent, participants were informed of the objectives of the research project and the profiles of the researchers, and they agreed to collaborate voluntarily in the project. Participants were purposively selected and contacted face-to-face to ensure diversity in school typology. In this way, participants were obtained from public, private and charter schools. Two of the schools initially contacted declined to participate due to technical problems with Internet connectivity in the school and the staff’s lack of time to attend the training in the evaluation of communicative competence. Participant teachers undertook a training course on communicative competence assessment. The course was delivered in the b-learning modality using the Moodle e-learning platform. During the training, teachers learned how to use a computer application to assess communicative competence using tablets. This custom-designed tool is called the ‘tool for the assessment of linguistic communication competence’ (hereafter, HERACLES). Later, teachers had the opportunity to implement what they had learned in their classes during one term. The tool was applied with 368 students in the experimental group and 285 in the group without the application (see Table 2).

After the application of the tool, the teachers were invited to participate in different discussion groups to report the results of the experience and the effect that HERACLES had had on their training. The focus groups were conducted in teachers’ workplaces by the three PhD authors of this paper, one female senior lecturer and two male senior lecturers from the universities of [authors], all experts in educational research. In two of the four schools, members of the management team also attended the focus groups, in addition to participant teachers. The discussion groups were audio-recorded and took place in the educational centres between June and September 2017.

2.2 Instruments

The analysis of the audio recordings of the discussion groups and of the field notes taken generated a system of inductive categories (see Table 3). This category system was compiled from the information provided by teachers in the discussion groups. The system of inductive categories was structured through a thematic frame based on the teaching staff’s experience in the use of a computer application to assess competence in communication in the classroom. The indicators focused on the ease of use of the computer tool, its usefulness in classroom evaluation and teachers’ assessment of the tool itself. The coding of the discussion group transcripts was performed by the three authors of the current paper. This system was applied both in the codification phase and in the later analysis of relations with Atlas-ti. The focus group script was designed by the team of authors of this paper and was evaluated by six experts in educational research. Their evaluation focused on the understandability of the interview questions and on the questions’ pertinence to the purpose of the research. The duration of the focus groups was approximately two hours. The transcriptions of the recordings were sent to the schools for review. The participants did not make any corrections to the content of the transcripts.

Table 3. System of inductive categories for communicative competence assessment through a computer application.

Categories | Description
Applicability in daily use | HERACLES enables the optimisation of the assessment
Methodological change | The teaching-learning process changes due to the assessment
Mistakes of the tablet | HERACLES does not facilitate the teaching-learning process
Learning phase | The training of teachers facilitates the implementation of HERACLES
Assessment indicators | The indicators of HERACLES facilitate the assessment for learning
Insufficient use of functions | Teachers do not use the functions available in HERACLES
Positive opinion of the tool | The use of HERACLES facilitates teachers’ work
Problem in digital literacy | Teachers face difficulty in their digital competence
Problems of connections | Teachers have problems related to connectivity during the use of HERACLES

2.3 Data analysis

The first aim was accomplished through a comparative analysis of the communicative competence’s main components gathered in the models of Canale [31] and Bachman [7] and their relations with both national legislation [33] and regional legislation [32]. This analysis was the basis of the development of a communicative competence assessment model.

The second aim was approached through a qualitative thematic analysis [53, 54] of the discussion groups. The data analysis of the discussion groups’ recordings was carried out with Atlas-ti version 6.2. In the operationalisation phase [55], the system of inductive categories [56] was elaborated after listening to all the recordings. The codification of each discussion group was performed a posteriori by three researchers, and the coefficient of agreement between codifiers was calculated using Fleiss’ kappa [57, 58].

The Fleiss’ kappa calculation yielded a value of K = 0.91 (see Table 4), which can be described as excellent interjudge concordance [57]. The disagreements between coders stemmed from their interpretation of how the transcription categories should be applied, a consequence of the inductive process through which the category system was created. These disagreements were resolved through an iterative review and clarification of the indicators of the category scheme. After the categorisation of the focus group transcriptions, the three authors of this paper carried out a synthesis and summary of the data. The final report with the results of the research was sent to the different schools for review and feedback.

Table 4. Results of Fleiss’s kappa.

Kappa (K) | ASE (asymptotic standard error) | Z-value | P-value
.90997038 | .05456660 | 16.67632489 | .00000000***

*p < .05, **p < .01, and

***p < .001.
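
For readers who wish to reproduce this kind of inter-coder agreement check outside Atlas-ti, the following is a minimal sketch in Python, assuming the codings are arranged as a matrix of transcript segments (rows) by coders (columns); the variable names and the example data are hypothetical illustrations, not the study’s actual codings.

```python
# Minimal sketch: Fleiss' kappa for three coders on hypothetical data.
# Requires numpy and statsmodels (pip install numpy statsmodels).
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Rows = coded segments of the focus group transcripts, columns = the 3 coders.
# Cell values are category labels from the inductive scheme (e.g. 0 = "Applicability
# in daily use", 1 = "Methodological change", ...). The data below are invented.
codings = np.array([
    [0, 0, 0],
    [1, 1, 1],
    [2, 2, 1],
    [3, 3, 3],
    [1, 1, 1],
])

# aggregate_raters converts the subjects-by-raters matrix into counts per category,
# which is the input format fleiss_kappa expects.
counts, categories = aggregate_raters(codings)
kappa = fleiss_kappa(counts, method="fleiss")
print(f"Fleiss' kappa = {kappa:.3f}")  # values above 0.75 are conventionally read as excellent
```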

Finally, we used different analyses of associations and semantic networks [59]. In the search for relations between the codes, we relied on the Atlas-ti Query Tool. We similarly used the Network tool to produce a graphic representation of these associations.
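
The association analysis itself was performed with the Atlas-ti Query and Network tools; purely as an illustration, the sketch below shows how the same kind of causal and associative relations between codes could be represented and queried programmatically with networkx in Python. The relation set shown is a simplified, assumed excerpt, not the full network reported in Fig 3.

```python
# Illustrative sketch (not Atlas-ti): code relations as a directed, labelled graph.
# Requires networkx (pip install networkx).
import networkx as nx

g = nx.DiGraph()
# Edges carry the relation type used in the analysis:
# 'cause-effect' for causal relations, 'related to' for associative ones.
g.add_edge("Applicability in daily use", "Methodological change", relation="cause-effect")
g.add_edge("Methodological change", "Assessment indicators", relation="related to")
g.add_edge("Positive opinion of the tool", "Applicability in daily use", relation="related to")
g.add_edge("Mistakes of the tablet", "Methodological change", relation="cause-effect")

# Example query: which codes are linked to methodological change by a causal relation?
causes = [u for u, v, d in g.edges(data=True)
          if v == "Methodological change" and d["relation"] == "cause-effect"]
print(causes)  # ['Applicability in daily use', 'Mistakes of the tablet']
```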

3 Results

3.1 A new communicative competence assessment model

The communicative competence assessment model proposed by Bachman [7] established a clear trend to measure competence as an interpersonal communication product. The elements that it proposes are based on an assessment of both the analysis of the environment of the assessment tasks (environment and type of test) and the indicators that differentiate diverse degrees of achievement of communicative competence in primary education (format, nature of the language, facet of response expected and relation between the input and output information).

The assessment model elaborated (see Table 5) presents the assessment indicators in general terms. However, these indicators must be adapted to each of the tasks and genres evaluated in the classroom. The assessment tool was based on the application of distinct elements of discourse analysis and on the selection and transformation of these elements into assessment indicators in the different dimensions. Table 5 presents examples of the assessment indicators related to the following aspects:

Table 5. The communicative competence assessment model.

Indicators are organised by level of discourse (genre, macrostructure and microstructure) and by communicative dimension (oral expression, oral comprehension, written expression and written comprehension).

Genre and macrostructure (1. Global coherence; 2. Linear coherence; 3. Textual cohesion)

Textual or discursive competence (all four dimensions):
Oral expression: structure of the discursive sequence; incorporation of narrative, descriptive, expositive and argumentative sequences; type of discursive interaction.
Oral comprehension: recognises the structure of the discursive sequence; extracts the main topic; identifies the oral genre.
Written expression: structure of the prototypical textual sequence; incorporation of the different micropropositions into the dominant macroproposition.
Written comprehension: identifies the internal organisation of the text (thematic progression); extracts the main or global theme; identifies the genre; recognises the words read previously.

Sociocultural competence (all four dimensions): norms of interaction and interpretation (speaking turns, presuppositions and implications); task of authentic assessment; the discursive genre in the classroom context; relates the genre with the communicative aim.

Pragmatic competence: values the appropriateness of the field, tenor and mode; adaptation of the linguistic register to the communicative aim; appropriateness to the tenor; responds to questions of inferential comprehension (not explicit in the text); responds to critical questions related to opinion.

Strategic competence: analysis of non-verbal elements; analysis of the relationships between non-verbal and verbal elements.

Microstructure (1. Lexical level; 2. Syntactic level)

Linguistic competence: scant lexical density and frequent redundancy, catchphrases, code phrases, etc.; syntactic complexity and grammatical structures (clauses, groups and phrases); discursive connectors; interprets pronouns, pauses and intonations to reinforce the textual cohesion.
  1. the levels of discourse (genre, macrostructure and microstructure);

  2. the four communicative competence dimensions (speaking, listening, reading and writing); and

  3. the classification of each indicator according to its belonging to various competences (textual, discursive, sociocultural, pragmatic, strategic or semilogical).
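
As a way of making this three-way structure concrete, the following sketch shows one possible in-memory representation of an assessment indicator (discourse level × communicative dimension × competence). The class and field names are assumptions introduced for illustration and are not taken from the HERACLES source code.

```python
# Hypothetical sketch of the indicator hierarchy in Table 5 (not the HERACLES data model).
from dataclasses import dataclass
from enum import Enum

class DiscourseLevel(Enum):
    GENRE = "genre"
    MACROSTRUCTURE = "macrostructure"
    MICROSTRUCTURE = "microstructure"

class Dimension(Enum):
    SPEAKING = "oral expression"
    LISTENING = "oral comprehension"
    WRITING = "written expression"
    READING = "written comprehension"

@dataclass
class Indicator:
    level: DiscourseLevel
    dimension: Dimension
    competence: str   # e.g. "textual or discursive", "sociocultural", "pragmatic"
    description: str  # wording adapted by the teacher to each task and genre

# Example entries drawn from Table 5
model = [
    Indicator(DiscourseLevel.MACROSTRUCTURE, Dimension.SPEAKING,
              "textual or discursive", "Structure of the discursive sequence"),
    Indicator(DiscourseLevel.MACROSTRUCTURE, Dimension.READING,
              "pragmatic", "Responds to questions of inferential comprehension"),
    Indicator(DiscourseLevel.MICROSTRUCTURE, Dimension.WRITING,
              "linguistic", "Syntactic complexity and grammatical structures"),
]

# Filter the indicators relevant to a writing task, as a teacher would when
# adapting the model to a specific genre.
writing_indicators = [i for i in model if i.dimension is Dimension.WRITING]
```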

The assessment of all these indicators in a school context made the development of the HERACLES computer application for tablets necessary. With this assessment tool (see Figs 1 and 2), it is possible to address not only the broad diversity of assessment indicators but also the heterogeneity of the students themselves, considering their individual variables.

Fig 1. HERACLES’ upper menu. Reprinted from the COMPLICE project under a CC-BY license.

Fig 2. HERACLES’ assessment area. Reprinted from the COMPLICE project under a CC-BY license.

This application enables a learning assessment to be carried out, providing information on students’ communicative competence teaching-learning process over a prolonged period of time. The process assessment can be performed through diverse techniques, such as observation, thinking aloud, or interviews via stimulated recall. Similarly, HERACLES can relate the assessment of the process with that of the product through the analysis tools of the oral and written discourse. It was designed to facilitate students’ daily follow-up work, streamline the registering of students’ communicative competence development, gather information on the teaching-learning process and facilitate decision making for the programming of communicative-competence-related tasks. With this tool, the communicative competence learning assessment process is systematised, and the task’s assessment can be carried out efficiently and without excessive resource costs in the performance of the teaching work [60].
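
To illustrate what this daily follow-up might look like in software terms, here is a minimal sketch of recording classroom observations against indicators and summarising them per student for feedback. It is an assumption-based illustration with invented names and data, not HERACLES’ actual data model.

```python
# Hypothetical sketch of longitudinal observation records (not the HERACLES data model).
from collections import defaultdict
from datetime import date

# Each record: (student, indicator description, achieved?, observation date). Invented data.
observations = [
    ("Ana",  "Extracts the main topic",   True,  date(2017, 4, 3)),
    ("Ana",  "Discursive connectors",     False, date(2017, 4, 10)),
    ("Luis", "Identifies the oral genre", True,  date(2017, 4, 3)),
    ("Ana",  "Discursive connectors",     True,  date(2017, 5, 8)),
]

# Keep the latest observation per student and indicator, which is the kind of
# information a teacher could feed back to the student.
latest = {}
for student, indicator, achieved, when in observations:
    key = (student, indicator)
    if key not in latest or when > latest[key][1]:
        latest[key] = (achieved, when)

# Summarise progress per student.
progress = defaultdict(list)
for (student, indicator), (achieved, when) in latest.items():
    progress[student].append((indicator, "achieved" if achieved else "in progress"))

print(dict(progress))
```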

3.2 Effects of the use of the computer application of the communicative competence assessment

The second aim of this research has been addressed from the perspective of the qualitative thematic analysis of the discussion groups. The study of the effect is divided into two perspectives: the positive effects, regarding the applicability of HERACLES and teachers’ methodological changes, and the negative effects of its use. The positive effects have been characterised through causal relations (‘cause-effect’) or associative relations (‘related to’) (see Fig 3). The analyses performed have not shown any significant differences between the cases studied. Consequently, in this section, the different cases have not been described separately.

Fig 3. Graphic representation of the relations between codes about the use of the communicative competence assessment computer application carried out in Atlas-ti.


The positive effects are organised into three groups of relations. The first, comprising the causal relation between applicability in daily use and methodological change, concerns the changes detected in the methodology when HERACLES was used with the tablets. In particular, the application of the communicative competence assessment criteria has enabled the improvement of the teaching-learning process in the centres analysed (‘the criteria of assessment (…) have helped me to focus on teaching’ [GD 1]). The communicative competence assessment has led some teachers to modify the assessment process, incorporating feedback (‘Yes, there are things I have proposed changing in the assessment: different forms of feedback with the students in the oral expositions and in the reading’ [GD 2]) and a process based on the learning assessment and adapted to the context of the classroom (‘Everything that is the theme of oral exposition and everything written (summaries) is something that I have had to introduce changes in to spotlight the assessment of the competence’ [GD 4]).

The second consists of the associative relation between the methodological change and the incorporation of assessment indicators into teachers’ daily activity. This has allowed for the evaluation of communicative competence dimensions that were not previously assessed in the classroom (‘I have used the tablet (…) when the children were speaking: if they gesticulated, if they stared, or if they used the appropriate vocabulary’ [GD 1]). In particular, the assessment of oral communication developed thanks to the simple use of the tablet as an assessment instrument during the teaching-learning process (‘Not a specific activity or day, but rather, it depends on the tasks of each subject’ [GD 1]). Moreover, the ease of assessing communicative competence in very disparate circumstances within the school day permits this assessment to be extended to different areas of the curriculum (‘It was not specifically in the language class but in the classes in which they carried out a task or an activity’ [GD 1]). Finally, the use of indicators has generated among teachers the perception of a more ‘objective’ assessment in the classroom (‘Assessment is an attitude, and it is very subjective. (…) The tool helps me to be more objective’ [GD 4]).

A third associative relation is established between maintaining a positive opinion about the use of HERACLES to assess communicative competence and the daily use of the tablet as an assessment instrument. Teachers perceived the assessment with tablets as simple and intuitive (‘It seemed to me quite simple and intuitive’ [GD 1]). The use of the assessment tool and its indicators came to be seen as easy and practical for the communicative competence assessment (‘It has been much more practical to assess according to the item they asked you’ [GD 3]). Similarly, the use of tablets links HERACLES and its assessment with facilitators of specific techniques, such as assessment through observation in the classroom (‘I would like to use it because it seems handier’ [GD 1]), making them quicker and more efficient in the current educational context.

The negative evaluations of the teachers concentrated on the mistakes of the computer application. The relation between the mistakes and the methodological change is causal: the difficulties found in the use of HERACLES led to fewer effects on the methodological change. On the one hand, they centred on the lack of a button to cancel the different notes recorded (‘I would have put the Yes/No option, but I missed the delete option’ [GD 1]). On the other hand, the difficulties came from the listing of the students in alphabetical order of their first names (and not by their surnames) and from the impossibility of selecting assessment indicators to adapt them to the task assessed and the age of the subjects (‘We did not have the option of marking which indicators we wanted to assess and which we did not’ [GD 1]). Finally, teachers suggested greater flexibility in incorporating data about the group and the students into HERACLES. In this sense, the computer application does not allow for adaptation to a specific context or for modification of the communicative competence assessment model to adapt it to the programming of the classroom (‘I cannot continue using the material because it is closed’ [GD 2]).

4 Discussion

Our research has addressed the design and effect of a communicative competence assessment model applied through a computer application. The first aim was to propose an assessment design that provides a tool to help teachers manage the complex process of assessment in the primary education classroom context.

The construction of an assessment model for communicative competence was based on the concept of assessment for learning in the context of the primary education curriculum. This model encourages a deeper analysis of communicative competence, incorporating the different competences involved (linguistic, pragmatic, strategic, etc.). Thus, the assessment of communicative competence (considered a formative assessment) requires a complex process of systematic data collection in the classroom, open to the different indicators determined by the model. In this way, teachers can evaluate communicative competence in different school subjects and develop improvement strategies aimed at one competence or another in a specific and personalised way. This proposal raises awareness of how the discourse has to be assessed, irrespective of the particularities of the assessment activities. The model enables simple and systematic access to the analysis of oral and written discourse, making it accessible to both teachers (in summative assessment) and students (through the feedback of the assessment for learning).

The application of this model for the assessment of communicative competence in primary education poses several problems. One of the problems teachers face in communicative competence assessment is the time cost that individualised attention requires. The proposed model advocates for a sustainable assessment [61]. The difficulty of communicative competence assessment requires teachers to address the complexity of the communicative competence teaching-learning process from an individualised perspective. This assessment model allows the time needed for this assessment to be reduced and, in turn, diversity to be addressed while respecting learning rhythms. The learning assessment will only have an effect in the medium and long term when it is maintained over time. That is, both the investment in teachers’ training time and the handling of the data obtained with computer applications must be preserved to offer greater rapidity in the feedback and feedforward [18, 62–64].

The second aim concerns the effect of the communicative competence assessment model’s application on teachers through the use of a custom-designed computer application in primary education. The results reveal a polarisation between two profiles of teachers. The first brings together those who have a positive attitude towards the implementation of new assessment tools. For these teachers, the tool has been useful and has helped improve the communicative competence teaching-learning process. The second profile groups those teachers who resist changes to the assessment models. For this group, the implementation of the new model presents numerous difficulties. The motives reside in the conceptual comprehension of technology in general and of tablets in particular, and in resistance to changes in an area such as the culture of school assessment. This resistance to the assessment model’s implementation has revealed how primary education assessment processes are the least porous to change in teachers’ continuous training process [65].

The assessment model’s application has enabled teachers of the first profile to incorporate communicative competence assessment into other curricular areas. The teachers understood that communicative competence assessment must not only be applied to Spanish language and literature. The model’s implementation has helped these teachers raise their awareness of assessment for key primary education competences [66].

5 Limitations and prospective research directions

The analysis of the research developed in this article has revealed some limitations. The first refers to the communicative competence assessment model. The indicators require teachers to adapt them to the different assessment tasks. This possibility must be taken into account in the future development of the HERACLES assessment tool with a view to training the teachers and optimising its use in the classroom.

The effect of the communicative competence model’s implementation in the studied centres showed that processes of change in assessment require a longer time period. In this sense, some of the teachers did not take full advantage of the assessment tool or use it systematically, as individual variables affected the rhythm of the model’s implementation. Future research projects will have to examine the learning rhythms of the teachers themselves when implementing improvements in the evaluation of the associated key competences.

Relatedly, the use of the HERACLES application presented some difficulties motivated by teachers’ scant development of digital competence. Consequently, this has meant a greater investment of time and effort in the adaptation of the assessment model and has brought about a certain dissatisfaction among participants due to their slow progress in the communicative competence assessment model’s changes.

Future works could extend the study to more educational centres that are interested in improving learning assessment. This would give greater potential to the impact it could have on primary education. Similarly, the HERACLES tool must be completed and modified by teachers with the aim of adapting it to each classroom’s teaching-learning processes. HERACLES must provide a model that is adapted later by the teacher to systematically and efficiently undertake the communicative competence assessment.

Supporting information

S1 Text. Information for teachers in participating schools.

(DOC)

S1 Data

(SAV)

S2 Data

(HPR6)

Data Availability

All relevant data are within the paper and its Supporting Information files.

Funding Statement

This work was supported by the Ministry of Economics and Competitiveness of Spain [grant number EDU2013-44176-P] as an R+D project entitled “Mejora de la Competencia en Comunicación Lingüística del alumnado de Educación Infantil y Educación Primaria” (“Improvement of the Competence in Linguistic Communication of Early Childhood Education and Primary Education Students”), obtained in a competitive call corresponding to the State Plan of Advancement of Scientific and Technical Research of Excellence, State Subprogram of Generation of Knowledge, of the Secretary of State for Research, Development and Innovation.

References

  • 1. Mullis I.V.S., Martin M.O., Foy P., Drucker K.T. PIRLS 2011 international results in reading. Chestnut Hill: International Association for the Evaluation of Educational Achievement (IEA); 2012.
  • 2. Mullis I.V.S., Martin M.O., Foy P., Hooper M. PIRLS 2016 international results in reading. Chestnut Hill: International Association for the Evaluation of Educational Achievement (IEA); 2016.
  • 3. Mullis I.V.S., Martin M.O., Kennedy A.M., Foy P. PIRLS 2006 international report. Chestnut Hill: International Association for the Evaluation of Educational Achievement (IEA); 2007.
  • 4. Ley Orgánica 10/2002, de 23 de diciembre, de Calidad de la Educación. Available from: https://www.boe.es/eli/es/lo/2002/12/23/10
  • 5. Ley Orgánica 2/2006, de 3 de mayo, de Educación. Available from: https://www.boe.es/eli/es/lo/2006/05/03/2/con
  • 6. Ley Orgánica 8/2013, de 9 de diciembre, para la Mejora de la Calidad Educativa. Available from: https://www.boe.es/eli/es/lo/2013/12/09/8/con
  • 7. Bachman LF. Fundamental considerations in language testing. Oxford: Oxford University Press; 1990.
  • 8. Canale M, Swain M. Theoretical bases of communicative approaches to second language teaching and testing. Applied Linguistics. 1980; 1(1): 1–47.
  • 9. Celce-Murcia M. Rethinking the role of communicative competence in language teaching. In: Alcón E, Safont MP, editors. Intercultural language use and language learning. Netherlands: Springer; 2007. p. 41–57.
  • 10. Coseriu E. Competencia lingüística. Elementos de la teoría del hablar [Linguistic competence. Elements of the theory of speaking]. Madrid: Gredos; 1992.
  • 11. Hymes D. On communicative competence. In: Pride IB, Holmes J, editors. Sociolinguistics. Baltimore: Penguin Books; 1972. p. 269–293.
  • 12. Savignon SJ. Communicative competence: Theory and classroom practice. Texts and contexts in second language learning. Reading, MA: Addison-Wesley; 1983.
  • 13. Widdowson HG. Learning purpose and language use. Oxford: Oxford University Press; 1983.
  • 14. Council of Europe. Common European Framework of Reference for Languages: Learning, teaching, assessment. Cambridge: Cambridge University Press; 2001.
  • 15. Leung C. Convivial communication: recontextualizing communicative competence. International Journal of Applied Linguistics. 2005; 15(2): 119–144.
  • 16. Leung C, Lewkowicz J. Language communication and communicative competence: a view from contemporary classrooms. Language and Education. 2013; 27(5): 398–414.
  • 17. Black P, et al. Working inside the black box: Assessment for learning in the classroom. Phi Delta Kappan. 2004; 86(1): 4–17.
  • 18. Carless D. Scaling up assessment for learning: Progress and prospects. In: Carless D, Bridges SM, Chan CKY, Glofcheski R, editors. Scaling up assessment for learning in higher education. Singapore: Springer; 2017. p. 3–17.
  • 19. Hutchinson C, Young M. Assessment for learning in the accountability era: Empirical evidence from Scotland. Studies in Educational Evaluation. 2011; 37: 62–70.
  • 20. Wiliam D, Thompson M. Integrating assessment with instruction: What will it take to make it work? In: Dwyer CA, editor. The future of assessment: Shaping teaching and learning. Mahwah, NJ: Erlbaum; 2007. p. 53–82.
  • 21. Wiliam D. What is assessment for learning? Studies in Educational Evaluation. 2011; 37: 3–14. doi: 10.1016/j.stueduc.2011.03.001
  • 22. Crooks TJ. The impact of classroom evaluation practices on students. Review of Educational Research. 1988; 58(4): 438–481.
  • 23. Black P, Wiliam D. Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability. 2009; 21: 5–31. doi: 10.1007/s11092-008-9068-5
  • 24. Poehner ME, Lantolf JP. Dynamic assessment in the language classroom. Language Teaching Research. 2005; 9(3): 233–265.
  • 25. Tsai MJ. Rethinking communicative competence for typical speakers: An integrated approach to its nature and assessment. Pragmatics & Cognition. 2013; 21(1): 158–177. doi: 10.1075/pc.21.1.07tsa
  • 26. Chomsky N. Aspects of the theory of syntax. Cambridge: MIT Press; 1965.
  • 27. Hymes D. Foundations in sociolinguistics: an ethnographic approach. London: Tavistock Publications; 1977.
  • 28. Wiemann JM. Explication and test of a model of communicative competence. Human Communication Research. 1977; 3: 195–213.
  • 29. Bloom L, Lahey M. Language development and language disorders. New York: John Wiley & Sons; 1978.
  • 30. Bryan T. A review of studies on learning disabled children's communicative competence. In: Schiefelbusch RL, editor. Language competence: Assessment and intervention. San Diego, CA: College-Hill Press; 1986. p. 227–259.
  • 31. Canale M. From communicative competence to communicative language pedagogy. In: Richards JC, Smith R, editors. Language and communication. London: Longman; 1983. p. 2–14.
  • 32. Decreto 230/2007, de 31 de julio, por el que se establece la ordenación y las enseñanzas correspondientes a la educación primaria en Andalucía. Available from: https://www.juntadeandalucia.es/boja/2007/156/1
  • 33. Orden ECD/65/2015, de 21 de enero, por la que se describen las relaciones entre las competencias, los contenidos y los criterios de evaluación de la educación primaria, la educación secundaria obligatoria y el bachillerato. Available from: https://www.boe.es/buscar/doc.php?id=BOE-A-2015-738
  • 34. Orden de 17 de marzo de 2015, por la que se desarrolla el currículo correspondiente a la Educación Primaria en Andalucía. Available from: https://www.juntadeandalucia.es/boja/2015/60/1
  • 35. Savignon SJ. Communicative competence. In: Liontas JI, editor. The TESOL encyclopedia of English language teaching. John Wiley & Sons; 2018. p. 1–7. doi: 10.1002/9781118784235.eelt0047
  • 36. Halliday MAK. Language as social semiotic: the social interpretation of language and meaning. London: Edward Arnold; 1978.
  • 37. Halliday MAK, Hasan R. Language, context, and text: Aspects of language in a social-semiotic perspective. Victoria: Deakin University Press; 1985.
  • 38. Savignon SJ. Communicative language teaching: Linguistic theory and classroom practice. In: Savignon SJ, editor. Interpreting communicative language teaching: Contexts and concerns in teacher education. New Haven & London: Yale University Press; 2002. p. 1–28.
  • 39. Lave J, Wenger E. Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press; 1991.
  • 40. Miller C. Genre as social action. Quarterly Journal of Speech. 1984; 70: 151–167.
  • 41. Lee JS, Bucholtz M. Language socialization across learning spaces. In: Markee N, editor. The handbook of classroom discourse and interaction. New York: John Wiley & Sons; 2015. p. 319–336.
  • 42. Ochs E. Introduction. In: Schieffelin B, Ochs E, editors. Language socialization across cultures. New York: Cambridge University Press; 1986. p. 1–13.
  • 43. Halliday MAK. Learning how to mean: Explorations in the development of language. London: Edward Arnold; 1975.
  • 44. Shepard LA. The role of classroom assessment in teaching and learning. Los Angeles, CA: CRESST–CSE Technical Report 517; 2000.
  • 45. Boud D. Great designs: what should assessment do? International Online Conference sponsored by the REAP Project: Assessment design for learner responsibility; 2007 May 29–31. Available from: http://www.reap.ac.uk/reap07/Portals/2/CSL/boudpres/AssessmentREAPConference07Boud.zip
  • 46. Rose D, Martin JR. Learning to write, reading to learn: Genre, knowledge and pedagogy in the Sydney School. Sheffield: Equinox; 2012.
  • 47. de Silva Joyce H, Feez S. Text-based language and literacy education: Programming and methodology. Putney (Australia): Phoenix Education; 2012.
  • 48. Paltridge B. Genre, frames, and writing in research settings. Amsterdam & Philadelphia: John Benjamins; 1997.
  • 49. Gee JP, Handford M, editors. The Routledge handbook of discourse analysis. New York: Routledge; 2012.
  • 50. Paltridge B. Discourse analysis: An introduction. London: Continuum; 2006.
  • 51. Paltridge B. Working with genre: a pragmatic perspective. Journal of Pragmatics. 1995; 24: 393–406.
  • 52. Van Dijk TA, Kintsch W. Strategies of discourse comprehension. Orlando: Academic Press; 1983.
  • 53. Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: A hybrid approach of inductive and deductive coding and theme development. International Journal of Qualitative Methods. 2006; 5(1): 80–92.
  • 54. Tuckett AG. Applying thematic analysis theory to practice: A researcher's experience. Contemporary Nurse. 2005; 19(1–2): 75–87. doi: 10.5172/conu.19.1-2.75
  • 55. Corbetta P. Metodología y técnicas de investigación social [Methodology and techniques of social research]. Madrid: McGraw-Hill; 2003.
  • 56. McMillan JH, Schumacher S. Investigación educativa [Educational research]. 5th ed. Madrid: Pearson Educación; 2005.
  • 57. Fleiss JL. Statistical methods for rates and proportions. New York: John Wiley & Sons; 1981.
  • 58. Viera AJ, Garrett JM. Understanding interobserver agreement: The kappa statistic. Family Medicine. 2005; 37(5): 360–363.
  • 59. Friese S. ATLAS.ti 7 user manual [Internet]. Berlin, Germany: Scientific Software Development; 2013 [revised 2018; cited 2019 Mar 29]. Available from: http://atlasti.com/manual.html
  • 60. Krasch D, Carter D. Monitoring and evaluating classroom behavior in early childhood settings. Early Childhood Education Journal. 2009; 36(6): 475–482.
  • 61. Boud D. Sustainable assessment: rethinking assessment for the learning society. Studies in Continuing Education. 2000; 22(2): 151–167. doi: 10.1080/713695728
  • 62. Carless D, Salter D, Yang M, Lam J. Developing sustainable feedback practices. Studies in Higher Education. 2011; 36(4): 395–407.
  • 63. Dawson P, Henderson M, Mahoney P, Phillips M, Ryan T, Boud D, et al. What makes for effective feedback: staff and student perspectives. Assessment & Evaluation in Higher Education. 2019; 44(1): 25–36. doi: 10.1080/02602938.2018.1467877
  • 64. García-Jiménez E. La evaluación del aprendizaje: de la retroalimentación a la autorregulación. El papel de las tecnologías [The assessment of learning: from feedback to self-regulation. The role of technologies]. RELIEVE. 2015; 21(2). doi: 10.7203/relieve.21.2.7546
  • 65. Monarca H, Rappoport S. Investigación sobre los procesos de cambio educativo: el caso de las competencias básicas en España [Research on educational change processes: the case of core competences in Spain]. Revista de Educación. 2013; número extraordinario: 54–78. doi: 10.4438/1988-592X-RE-2013-EXT-256
  • 66. Perrenoud P. Cuando la escuela pretende preparar para la vida. ¿Desarrollar competencias o enseñar otros saberes? [When school claims to prepare for life: Develop competences or teach other knowledge?]. Barcelona: Graó; 2012.

Decision Letter 0

Vasileios Stavropoulos

13 Feb 2020

PONE-D-19-33902

Communicative competence in assessment for learning: effect of the application of a model on teachers in Spain

PLOS ONE

Dear Dr. Torres-Gordillo,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Although the topic is interesting, changes are mainly required in the delivery of the content and in the introduction and methods sections, as per the reviewers' comments. Given the qualitative nature of the study, the accuracy of language and content is deemed significant.

We would appreciate receiving your revised manuscript by Mar 29 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as separate file and labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as separate file and labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Vasileios Stavropoulos

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

http://www.journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and http://www.journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. We suggest you thoroughly copyedit your manuscript for language usage, spelling, and grammar. If you do not know anyone who can help you do this, you may wish to consider employing a professional scientific editing service.  

Whilst you may use any professional scientific editing service of your choice, PLOS has partnered with both American Journal Experts (AJE) and Editage to provide discounted services to PLOS authors. Both organizations have experience helping authors meet PLOS guidelines and can provide language editing, translation, manuscript formatting, and figure formatting to ensure your manuscript meets our submission guidelines. To take advantage of our partnership with AJE, visit the AJE website (http://learn.aje.com/plos/) for a 15% discount off AJE services. To take advantage of our partnership with Editage, visit the Editage website (www.editage.com) and enter referral code PLOSEDIT for a 15% discount off Editage services.  If the PLOS editorial team finds any language issues in text that either AJE or Editage has edited, the service provider will re-edit the text for free.

Upon resubmission, please provide the following:

  • The name of the colleague or the details of the professional service that edited your manuscript

  • A copy of your manuscript showing your changes by either highlighting them or using track changes (uploaded as a *supporting information* file)

  • A clean copy of the edited manuscript (uploaded as the new *manuscript* file)

3. PLOS ONE will consider submissions that present new methods, software, or databases as the primary focus of the manuscript if they meet the criteria of utility, validation, and availability described here: http://journals.plos.org/plosone/s/submission-guidelines#loc-methods-software-databases-and-tools. To meet these criteria, please provide supporting materials enabling other teachers and researchers to replicate your teaching intervention such as sample worksheets, a detailed lesson plan or curriculum or other educational materials. If you include supporting materials, they should not be under a copyright more restrictive than CC-BY.

4. Please state specifically whether the IRB approved the study.

5. Please provide additional details regarding participant consent. In the ethics statement in the Methods and online submission information, please ensure that you have specified (1) whether consent was informed and (2) what type you obtained (for instance, written or verbal, and if verbal, how it was documented and witnessed). If your study included minors, state whether you obtained consent from parents or guardians. If the need for consent was waived by the ethics committee, please include this information.

Please provide a copy of the topic guide as Supplementary Information as well as English translations of the screenshots and provided materials.

6. We note that Figures 1 and 2 in your submission contain copyrighted images. All PLOS content is published under the Creative Commons Attribution License (CC BY 4.0), which means that the manuscript, images, and Supporting Information files will be freely available online, and any third party is permitted to access, download, copy, distribute, and use these materials in any way, even commercially, with proper attribution. For more information, see our copyright guidelines: http://journals.plos.org/plosone/s/licenses-and-copyright.

We require you to either (1) present written permission from the copyright holder to publish these figures specifically under the CC BY 4.0 license, or (2) remove the figures from your submission:

1. You may seek permission from the original copyright holder of Figures 1 and 2 to publish the content specifically under the CC BY 4.0 license.

We recommend that you contact the original copyright holder with the Content Permission Form (http://journals.plos.org/plosone/s/file?id=7c09/content-permission-form.pdf) and the following text:

“I request permission for the open-access journal PLOS ONE to publish XXX under the Creative Commons Attribution License (CCAL) CC BY 4.0 (http://creativecommons.org/licenses/by/4.0/). Please be aware that this license allows unrestricted use and distribution, even commercially, by third parties. Please reply and provide explicit written permission to publish XXX under a CC BY license and complete the attached form.”

Please upload the completed Content Permission Form or other proof of granted permissions as an "Other" file with your submission. 

In the figure caption of the copyrighted figure, please include the following text: “Reprinted from [ref] under a CC BY license, with permission from [name of publisher], original copyright [original copyright year].”

2. If you are unable to obtain permission from the original copyright holder to publish these figures under the CC BY 4.0 license or if the copyright holder’s requirements are incompatible with the CC BY 4.0 license, please either i) remove the figure or ii) supply a replacement figure that complies with the CC BY 4.0 license. Please check copyright information on all replacement figures and update the figure caption with source information. If applicable, please specify in the figure caption text when a figure is similar but not identical to the original image and is therefore for illustrative purposes only.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: No

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

Reviewer #2: No

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The authors of this paper sought to establish a framework for communicative competence (CC) assessment in Spanish classrooms and establish the acceptability of using a tablet-based tool for this assessment. The qualitative methods were appropriate and the authors summarise many useful themes that will be helpful in furthering the design and implementation of their tool and program. However, this manuscript was challenging to read in parts and could benefit from considering the following suggestions for improvement:

* The manuscript requires English editing to improve the clarity. For example, the authors state, “This leads is to go deeply into the link…”, but it is not clear which linked concepts are being explored. I assume they mean the link between CC in the curriculum and accountability in teaching programs? Phrases such as “Future research projects will have to set out from the rhythms of learning of the teachers themselves when implementing improvements in the evaluation of the key competences” do not quite make sense, i.e. it is not clear to an English speaker what “rhythms of learning” means.

* One focus of English editing should be more consistent use of tense. It is not always clear if the authors are describing something that was done or found in their study (past tense) or something that currently occurs in classrooms (present tense).

* The rationale of the study is not clear from the first two paragraphs. The authors clarify this somewhat later, however this rationale should be clear from the beginning of the paper, as should clear definitions of key concepts, particularly CC.

* It is not clear why the learning assessment framework is being introduced where it appears, or what its role is. Is this an established framework for integrating CC into curriculums? Or is this a framework the authors proposed might be useful for achieving this?

* The definition of Learning assessment is circular and should be expanded upon.

* It is not clear to a reader not familiar with this area what the term “Denomination” refers to specifically in this context. Please clarify this.

* I’m not sure what the source “own elaboration” for Table 1 refers to?

* More detail is required regarding how the themes were established and categorised. What established thematic framework was used? Who specifically were the raters? How were disagreements addressed? Who synthesised and summarised the data once it was categorised?

* I am not clear what “ad hoc designed computer application” means. Custom-designed?

Reviewer #2: Overall, this is an interesting and worthwhile area of research, with implications for improving the assessment of primary school students. Further review of the paper by the authors is recommended prior to publication, to ensure that all important terms are clearly explained, the rationale for the research is more clearly outlined at the outset, the methodology is clearly linked to the data analysis, and both aims are clearly defined. Please note that the "no" response regarding whether the manuscript is presented in an intelligible fashion refers to the overall clarity of the writing. There are many strong aspects to this paper, but also important areas that could be written more clearly.

Please see below for comments about each section of the paper.

Introduction

The Introduction could be clearer in providing an orientation to the topic and key questions to be addressed. It mentions PIRLS without explaining this, and while this might be common knowledge for those working in education, some brief explanation of PIRLS would be helpful for other readers. Furthermore, it is not quite clear what is being said in regard to improvements or changes in primary school students – when saying, “The results of the improvement in reading of Spanish students in Primary Education (hereafter, PE) present a slight improvement on the PIRLS assessment (2006; 2011; 2016)”, it is not clear what improvements are being discussed, what is meant by a “slight improvement”, and over what time period this is being mentioned (i.e. is it since the last PIRLS assessment?). This slight improvement appears to be discussed as an area of concern, but again, it is unclear why this is of concern, so more background and context here would be helpful.

There is good discussion of what is meant by Communicative Competence; however, it would be helpful to more clearly outline earlier what was meant by the results of CC not being what was expected, so as to clarify how they were not what was expected.

Aims

Generally clear, but Aim 1 could be more specific – given you are discussing PE teachers, it would be good to specify here whether the CC assessment model was based on PE curriculum development.

Method

In this section, please briefly clarify why a multiple case study design has been chosen as the methodological framework, so there is a sound rationale for this.

More detail is needed here to conform to the Consolidated criteria for reporting qualitative studies (COREQ): 32-item checklist, in all domains listed in that checklist. Please review this checklist and add in missing information.

While it appears that participants were teachers rather than students, students are mentioned as if they are part of the sample. It is important to clarify whether they were participants, and if so, what procedures were used for informed consent.

Results

As with the Method, this does not conform to all domains of the COREQ, so further review of these is recommended.

The chosen methodology is a multiple case study design; however, the results do not appear to be written in this manner. You may wish to reconsider whether the multiple case study approach is appropriate (I would suggest it is not, given it typically involves multiple sources of information for each case) and rewrite this section with a different methodology as appropriate.

Discussion

Generally clear, with the second aim of the study being discussed well. Writing more clearly about the first aim of the study would be helpful here – it is discussed, but could be clearer.

Limitations

Good discussion of limitations, such as potential lack of digital competence within teachers, which is an important area for further education and research.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Alana Howells

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2020 May 29;15(5):e0233613. doi: 10.1371/journal.pone.0233613.r002

Author response to Decision Letter 0


25 Mar 2020

Dear reviewers,

1. The authors have incorporated substantial improvements in the text aimed at increasing the research rigour of this paper. These modifications have affected all sections of the paper, especially the introduction, methodology and discussion.

2. The authors have not conducted a statistical analysis in this manuscript, as the research design was geared towards the analysis of qualitative data. Regarding the Fleiss’ kappa calculation, the computations are correct (a minimal illustration of such an inter-coder agreement calculation is sketched after point 4 below).

3. The authors have completed the information required by the reviewers throughout the text and have incorporated new complementary documents related to the development of the research. However, this research must guarantee the anonymity of the participants, as well as that of the educational institutions to which they belong. These criteria were part of the commitment made by the researchers to the participants before the start of the research.

4. The authors have conducted a thorough review of the academic writing of this paper. To this end, the authors carefully reviewed the text of the paper and subsequently submitted it to AJE for a review of its academic style. As a result of both revisions, the authors consider that the text is now clearer and more accurate.
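For illustration only, the following minimal sketch shows how an inter-coder agreement check with Fleiss’ kappa can be computed in Python using the statsmodels library. The segment-by-coder category assignments below are hypothetical and serve only to demonstrate the calculation; they are not the study’s coding data.

    # Minimal, hypothetical sketch: Fleiss' kappa for three coders over a few
    # transcript segments (illustrative category ids only, not the study data).
    import numpy as np
    from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

    # Rows = transcript segments, columns = coders; values = category ids.
    codes = np.array([
        [0, 0, 0],
        [1, 1, 2],
        [2, 2, 2],
        [1, 1, 1],
        [0, 2, 0],
    ])

    counts, _ = aggregate_raters(codes)    # segments x categories count table
    print(round(fleiss_kappa(counts), 3))  # approx. 0.6 for this toy example

Here, aggregate_raters converts the segment-by-coder matrix into the segment-by-category count table that fleiss_kappa expects; by the usual Landis and Koch benchmarks, values between 0.61 and 0.80 are read as substantial agreement.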

Reviewer #1

1. The authors have carried out a thorough revision of the academic style of the paper. The modifications to the text were recommended by AJE and are highlighted in grey.

2. The authors have reviewed the use of verb tenses throughout the paper. The modifications to the text were recommended by AJE.

3. Rationale of the study:

The authors have modified the rationale of the research and have reviewed the definitions of key concepts (such as communicative competence and assessment for learning). The modifications to the text have been highlighted in the text in grey. The authors have incorporated the following texts into the main paper:

“Assessments carried out by the International Association for the Evaluation of Educational Achievement (IEA) in Spain have provided new evidence for the effects of the educational improvement measures applied in primary education in the last two decades. In particular, the assessment performed in Progress in International Reading Literacy Study (PIRLS) in 2006, 2011 and 2016 has shown how competence in communication in Spanish primary education has not progressed at the rate of that of other European countries [1-3]”.

“Today, we are familiar with numerous definitions of communicative competence [7-13]. The publication in 2001 of the Common European Framework of Reference for Languages [14] has enabled us to describe the skills required for communication and their levels of achievement related to reading, writing, listening and speaking”.

“Consequently, our paper focuses on the design and evaluation of a model for the assessment of communicative competence based on the Spanish curriculum through the use of a custom-designed computer application.”

4. The authors have reviewed the relationship between communicative competence and assessment for learning throughout the paper. The modifications have been highlighted in the text in grey. The authors have incorporated the following texts into the main paper:

“The design of an assessment model of communicative competence in the school context requires prior reflection related to the dimension assessment model, first, and an assessment of communicative competence, second. Our research started with a reflection on which assessment model for learning was the most appropriate for incorporating communicative competence assessment in the primary education classroom. Assessment for learning is considered an assessment that fosters students’ learning [17-19].”

5. The authors are not drawing on the "learning assessment" concept, but on "assessment for learning". The misunderstanding was caused by a mistake in the translation. In the revision, the authors have applied the latter concept throughout the text.

6. The authors have rephrased the paragraphs indicated by the reviewer. The authors have incorporated the following texts into the main paper:

“Nevertheless, the diverse conceptualisations of communicative competence theoretical construct, found in the contributions of Canale [22] and Bachman [21], and in the different Spanish legislations (national and autonomous), maintain the same basic scheme of communicative competences.”

“A careful reading of this table highlights how the concept of communicative competence is not affected by the diverse names used for its designation. The different authors and the legal texts propose the same parameters but present a different degree of specification and depth.”

7. The authors have removed the "own elaboration" source note for a better understanding of the text.

8.

a. How the themes were established and categorised.

The category scheme was developed in an inductive way. That is, the categories emerged directly from the analysis of the information provided by the teachers in the focus groups. The following sentence has been added to the main document: "This category system was compiled from the information provided by teachers in the discussion groups".

b. What established thematic framework was used?

The authors incorporate the following text into the main document:

"The inductive system of categories was structured through a thematic frame based on the teaching staff’s experience in the use of a computer application to assess competence in communication in the classroom. The indicators focused on the ease of use of the computer tool, its usefulness in classroom evaluation, and teachers' assessment of the tool itself".

c. Who specifically were the raters?

The authors incorporate the following text into the main document:

" The coding of the discussion group transcripts was performed by the three authors of the current paper".

d. How were disagreements addressed?

The authors incorporate the following text into the main document:

"The disagreement between the different coders was motivated by their interpretation of the application of the transcription categories, which was the result of the inductive process of the creation of the category system. These disagreements were solved through a process of iterative review and clarification of the indicators of the category scheme".

e. Who synthesised and summarised the data once it was categorised?

The authors incorporate the following text into the main document:

" After the categorisation of the focal group transcriptions, the three authors of this paper carried out a synthesis and summary of the data".

9. The authors have rephrased these paragraphs:

“To check the effect of the communicative competence assessment model on primary education teachers using a computer application.”

“This tool, called the ‘tool for the assessment of linguistic communication competence’ (hereafter, HERACLES), was custom-designed.”

“Consequently, our paper focuses on the design and evaluation of a model for the assessment of communicative competence based on the Spanish curriculum through the use of a custom-designed computer application.”

“The second aim presents the effect of the communicative competence assessment model’s application on teachers through the use of a custom-designed computer application in primary education. The results reveal a polarisation between two profiles of teachers.”

Reviewer #2

1. The authors have incorporated clarifications regarding international PIRLS assessments. The authors have rewritten this section as follows:

“Assessments carried out by the International Association for the Evaluation of Educational Achievement (IEA) in Spain have provided new evidence for the effects of the educational improvement measures applied in primary education in the last two decades. In particular, the assessment performed in Progress in International Reading Literacy Study (PIRLS) in 2006, 2011 and 2016 has shown how competence in communication in Spanish primary education has not progressed at the rate of that of other European countries [1-3]”.

“However, their results have not been expected in the area of communicative competence. Today, we are familiar with numerous definitions of communicative competence [7-13]. The publication in 2001 of the Common European Framework of Reference for Languages [14] has enabled us to describe the skills required for communication and their levels of achievement related to reading, writing, listening and speaking.”

"Consequently, our paper focuses on the design and evaluation of a model for the assessment of communicative competence based on the Spanish curriculum through the use of a custom-designed computer application.”

"The design of an assessment model of communicative competence in the school context requires prior reflection related to the dimension assessment model, first, and an assessment of communicative competence, second. Our research started with a reflection on which assessment model for learning was the most appropriate for incorporating communicative competence assessment in the primary education classroom. Assessment for learning is considered an assessment that fosters students’ learning [17-19]."

2. Aims: The authors have rewritten the aims of this study as follows:

(1) To design a communicative competence assessment model based on the Spanish primary education curriculum.

(2) To check the effect of the communicative competence assessment model on primary education teachers using a computer application.

3. Method:

The authors have reviewed the 32-item checklist and modified the main text when necessary. The following information has been incorporated:

a. The different focus groups were conducted in teachers’ workplaces by the three PhD authors of this paper: one female senior lecturer and two male senior lecturers from the universities of [authors], all experts in educational research.

b. Prior to giving consent, participants were informed of the objectives of the research project and the profiles of the researchers, and they agreed to collaborate voluntarily in the project.

c. Participants were intentionally selected face-to-face for their diversity in school typology. In this way, participants were obtained from public, private and charter schools.

d. Two of the initially contacted schools refused to participate due to technical problems with Internet connectivity at the school and the staff’s lack of time to attend the training in the evaluation of communicative competence.

e. In two of the four schools, members of the management team also attended the focus groups, in addition to participant teachers.

f. The focus group script was designed by the team of authors of this paper and was evaluated by six experts in educational research. Their analysis focused on the comprehensibility of the interview questions and on their pertinence to the purpose of the research.

g. The analysis of the audio recordings of the discussion groups and of the field notes taken generated a system of inductive categories (see Table 3).

h. The duration of the focus groups was approximately two hours.

i. The transcriptions of the recordings were sent to the schools for review. The participants did not make any corrections to the content of the transcripts.

j. The final report with the results of the research was sent to the different schools for review and feedback.

4. Results:

The authors have revised the methodology to conform to the reviewers' recommendations. The text that clarifies this issue is as follows:

"The research design is based on the use of the focus group technique for the study of the same reality, developed through four study groups. Each of these groups represents a school with different characteristics and profiles (see Table 2), enabling a multi-perspective approach, where schools represent different opinions and experiences. The COREQ checklist was followed. All the participants were informed of the nature and aim of the research, thus conforming to the rules of informed consent, and signed written consent forms".

5. Discussion:

The authors have rewritten the Discussion section, clarifying those aspects related to the first aim. Thus, the authors have incorporated the following fragment into the main text:

“The construction of an assessment model for communicative competence was based on the assessment of the learning concept in the context of the primary education curriculum. This model encourages a deeper analysis of communicative competence, incorporating the different competences involved (linguistic, pragmatic, strategic, etc.). Thus, the assessment of communicative competence (considered a formative assessment) requires a complex process of systematic data collection in the classroom, open to the different indicators determined by the model. In this way, teachers can evaluate communicative competence in different school subjects and develop improvement strategies aimed at one competence or another in a specific and personalised way. This proposal enables a clear heightened awareness of how the discourse has to be assessed, irrespective of the particularities of the assessment activities. This model enables the simple and systematic accessing of the analysis of the oral and written discourse, making it accessible to both teachers (in summative assessment) and students (through the feedback of the assessment for learning)”

“The application of this model as an assessment of communicative competence in primary education poses several problems. One of the problems of teachers in communicative competence assessment is the time cost that individualised attention requires. The proposed model advocates for a sustainable assessment [61]. The difficulty of communicative competence assessment requires teachers to address the complexity of the communicative competence teaching-learning process from an individualised perspective. This assessment model allows for reducing the time of this assessment and, in turn, addressing diversity respecting the learning rhythms. The learning assessment will only have an effect in the medium and long term when it is maintained over time. That is, both investment in teachers’ training time and handling of the data, which are obtained with computer applications, must be preserved to offer greater rapidity in the feedback and feedforward [18, 62-64]”.

6. Limitations:

Thanks.

Attachment

Submitted filename: Response to Reviewers5.docx

Decision Letter 1

Vasileios Stavropoulos

11 May 2020

Communicative competence assessment for learning: the effect of the application of a model on teachers in Spain

PONE-D-19-33902R1

Dear Dr. Torres-Gordillo,

We are pleased to inform you that, based on the two reviewers’ recommendations, your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it complies with all outstanding technical requirements.

Within one week, you will receive an e-mail containing information on the amendments required prior to publication. When all required modifications have been addressed, you will receive a formal acceptance letter and your manuscript will proceed to our production department and be scheduled for publication.

Shortly after the formal acceptance letter is sent, an invoice for payment will follow. To ensure an efficient production and billing process, please log into Editorial Manager at https://www.editorialmanager.com/pone/, click the "Update My Information" link at the top of the page, and update your user information. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, you must inform our press team as soon as possible and no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

With kind regards,

Vasileios Stavropoulos

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: (No Response)

Reviewer #2: Great work on improving this paper. It is now much clearer and more straightforward. You have addressed all the issues appropriately.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Alana Howells

Acceptance letter

Vasileios Stavropoulos

20 May 2020

PONE-D-19-33902R1

Communicative competence assessment for learning: the effect of the application of a model on teachers in Spain

Dear Dr. Torres-Gordillo:

I am pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please notify them about your upcoming paper at this point, to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

For any other questions or concerns, please email plosone@plos.org.

Thank you for submitting your work to PLOS ONE.

With kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Vasileios Stavropoulos

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Text. Information for teachers in participating schools.

    (DOC)

    S1 Data

    (SAV)

    S2 Data

    (HPR6)

    Attachment

    Submitted filename: Response to Reviewers5.docx

    Data Availability Statement

    All relevant data are within the paper and its Supporting Information files.

