Keywords
Teaching English; Second language; English for Specific Purposes; Military English; Training Strategies; NATO Level 2; Functional exams; CEF Level B2; Independent User exams.
English language teaching, learning, and testing carry high stakes not only for individual military members, but also for countries aiming to meet common language goals; this also applies to North Atlantic Treaty Organization (NATO) countries. Regarding the goal of Standard Language Profile (SLP) exams, military personnel from NATO countries aspire to appointments that require specified NATO English levels. Furthermore, there are many applicants seeking positions within NATO that also require a high level of English. SLP exams are based on testing four skills – listening, speaking, reading, and writing – following the STANAG 6001 criteria.a It is the responsibility of each country to establish its own training structure, design its syllabi and teaching materials, implement a testing framework, develop tests, and monitor training outcomes for these exams.
Given the scarcity of research on training to achieve NATO Level 2 (L2) within the specific field of the armed forces, the present study bridges this gap by addressing the development of these four skills using a new methodology that aims to improve training and results. To do this, the study explores the benefits of using different techniques and newer methodologies adapted to NATO L2 exams. The results will help to clarify the gaps in training and education programs, and may have useful implications for training planners and test administrators seeking more satisfactory results. This study also undertook a literature review in order to obtain valuable sources to consult when forming the new training. Consequently, the overall aim of the study is to improve an L2 training program and to explore its effect in the field of NATO L2 within STANAG 6001, so as to reduce unsatisfactory results. The improved program is then tested to see whether it leads to better scores and skill levels.
The Council of Europe has been active in the promotion of modern language learning and teaching since the signing of the European Cultural Convention in 1954. In 1989, member states agreed a set of issues on which it would be useful to organize programmes of research and development.b These were:
• An enriched model for specifying objectives
• Making use of mass media and new technologies
• Bilingual education
• The role of educational and cultural links, visits, and exchanges
• Learning to learn and the promotion of learner autonomy
There has been a rapid expansion in the membership of the Council for Cultural Co-operation (a subdivision of the Council of Europe) following the political changes in Central and Eastern Europe around 1990. The Council has provided important guidelines for the reform and re-orientation of language teaching in new member states, as it had done previously in its principal project activity. The beginning of the 1990s was characterised by a general shift in focus. With the enlargement of the European Union, political, cultural, and scientific cooperation suddenly started to take centre stage. The Council resolved to make special efforts in national policies to promote a common understanding, in particular among young people, through cultural exchanges, co-operation in all fields of education and, more specifically, through teaching and training in the languages of other participating States.1,c
A Frameworkd intended to support the above was published in English (Cambridge University Press), French (Hachette) and German (Langenscheidt).
The Framework consists of a descriptive scheme setting out an analysis of language use and of the many ‘competences’, i.e., the shared knowledge and skills, which enable users of a language to communicate with each other. Wherever possible, these are separately calibrated with brief descriptors defining six levels of proficiency. Overall progress is also calibrated in this way. The Framework does not set out to prescribe standards but provides a basis for all involved in the teaching/learning process to reflect, plan, and communicate their decisions on objectives, methods, and achievements transparently and in compatible terms.
According to the manual “Language Policy Unit, Strasbourg”,e the Common European Framework of Reference for Languages (CEF) provides a common basis for the elaboration of language syllabuses, curriculum guidelines, examinations, and textbooks across Europe. It describes in a comprehensive way what language learners must learn in order to communicate in a language, and what knowledge and skills they have to develop so as to be able to do so effectively. The description also covers the cultural context in which language is embedded. The Framework also defines common levels of proficiency, which allow learners’ progress to be measured at each stage of learning and on a life-long basis. Following the official manual’s explanation, the CEF is intended to overcome the barriers to communication among professionals from different backgrounds (education, psychology, medicine, social work) and from the different educational systems in Europe. It provides the means for educational administrators, course designers, teachers, teacher trainers, and examining bodies to reflect on their current practice, with a view to situating and co-ordinating their efforts and ensuring that they meet the real needs of learners.
Giving formal recognition to such abilities will help to promote multilingualism through the learning of a wider variety of European languages.
The uses of the Framework include:
• The planning of language learning programmes
• The planning of language certification
• The planning of self-directed learning
• Learning programmes and certification
Language schools and certification bodies evaluate their own equivalences against the Framework. Differences of estimation have been found to exist, for example, between the same level on the PTE A (the Pearson Test of English Academic), TOEFL (Test of English as a Foreign Language), and IELTS (International English Language Testing System), and this is a cause of debate between test producers. The CEF methodology has been extended to describe and evaluate the proficiency of users of programming languages, when the programming activity is considered as a language activity.
NATO is an intergovernmental military alliance between 29 countriesf that was established in 1949. It constitutes a system of collective defence whose members have agreed to mutual defence in case of an external attack. These countries have participated in many international missions and joint operations around the world. Because of that diversity, a common language is required to understand and be understood. Some of these missions required communication and cooperation with the other armed forces involved; some were administrative duties, others combat missions. The need for linguistic communication was particularly important in order to avoid any misunderstanding, given the risks that language mistakes could entail.g
In 2003, The Bureau for International Language Co-ordination (BILC),h which is the consultative and advisory body for language training in NATO, decided to release a document in order to standardize the language training and testing for all NATO members.
The Standard Agreement was called STANAG 6001,i and it explains in detail the language proficiency levels that are considered the most appropriate in order to participate in joint international missions. Since that agreement, NATO members only deploy personnel to missions if they have certified the appropriate levels according to the standards. NATO countries strictly follow the Standard Agreements, and they allocate substantial financial resources to language training.j However, the training program needed to achieve the SLP is the responsibility of each country; each must establish its own materials and training plans. As Green & Wall (2005)2 reported in their study, “some teams have taken a general English approach in their testing, others have incorporated a ‘military flavour’, and still others have used texts taken from military sources and tasks based on military scenarios”.
The language proficiency skills are broken down into six levels, coded 0 through 5. In general terms, levels may be defined as follows: Level 0, no proficiency; Level 1, survival; Level 2, functional; Level 3, professional; Level 4, expert; Level 5, highly-articulate/native.
A series of plus (+) descriptions is provided. A plus indicator may be added to a base level for training, evaluation, recording, or reporting purposes, to indicate a level of proficiency that substantially exceeds a 0 through 4 base skill level but does not fully or consistently meet all of the criteria for the next higher base level. In general terms, plus levels may be defined as follows: Level 0+, memorized proficiency; Level 1+, survival+; Level 2+, functional+; Level 3+, professional+; Level 4+, expert+
Language proficiency profiles are recorded using a sequence of 4 digits, with plus indicators if/when applicable, to represent the four language skill areas, and those skills will be listed in the following sequence: Skill L Listening; Skill S Speaking; Skill R Reading; Skill W Writing. This four-digit number will be preceded by the code letters SLP to indicate that the profile shown is the Standardised (S) Language (L) Profile (P). (For example: SLP 3321 means level 3 in listening, level 3 in speaking, level 2 in reading and level 1 in writing). The highest level of proficiency is level 5, and the lowest level of proficiency is level 0.
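To make the coding scheme concrete, the following is a minimal, illustrative sketch (in Python) of how an SLP string such as “SLP 3321”, or a profile with a plus indicator, could be split into per-skill levels. The skill order (L, S, R, W) and the plus notation follow the description above; the parser itself is a hypothetical example, not an official NATO or BILC tool.

# Illustrative only: the skill order (L, S, R, W) and the "+" indicator follow
# the STANAG 6001 profile notation described above; this parser is a hypothetical
# sketch, not an official tool.

SKILLS = ("Listening", "Speaking", "Reading", "Writing")

def parse_slp(profile: str) -> dict:
    """Split an SLP string such as 'SLP 3321' or 'SLP 2+221' into per-skill levels."""
    code = profile.removeprefix("SLP").strip()
    levels, i = [], 0
    while i < len(code):
        level = code[i]
        # A '+' following a digit marks a plus level (e.g. '2+').
        if i + 1 < len(code) and code[i + 1] == "+":
            level += "+"
            i += 1
        levels.append(level)
        i += 1
    if len(levels) != 4:
        raise ValueError(f"Expected four skill levels, got {levels}")
    return dict(zip(SKILLS, levels))

print(parse_slp("SLP 3321"))
# {'Listening': '3', 'Speaking': '3', 'Reading': '2', 'Writing': '1'}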
CEF levels of proficiency are broken down into three broad bands (Basic, Independent, and Proficient users), each of which is divided into two levels (1 and 2): A, Basic User [A1 (Breakthrough); A2 (Waystage)]; B, Independent User [B1 (Threshold); B2 (Vantage)]; C, Proficient User [C1 (Operational Proficiency); C2 (Mastery)].
Table 1 shows a comparison between CEF and NATO Language Standards in detail.
Source: www.campaignmilitaryenglish.com.
Figure 1 shows the overall research design, which took place from May 2016 to November 2018. The research was split into four separate studies: 1) a preliminary survey/interview of military personnel about NATO L2 training; 2) gathering of data from NATO L2 exams after normal training; 3) gathering of data from NATO L2 exams after the training strategies proposal (TSP); 4) a post-training survey. Data were collected from examinees over a period of six months or one year, using mixed data collection methods: surveys, interviews, and exam results. Following Johnson & Christensen (2004),3 the interviews provided the participants’ perspectives that lie behind the proposed training before and after the NATO L2 exams. The mixing of the quantitative results and qualitative findings occurred in the final discussion, in which the study highlighted the quantitative results and the complexities that surfaced from the qualitative results.
The TSP was implemented between Studies 2 and 3 and is discussed in more detail below. The TSP lasted either six months or one year. A variety of training strategies, knowledge of the students’ levels, and an understanding of which strategies suited each examinee best helped the trainer decide which program (six months or one year) would be the most effective for each examinee.
The study and the application of the TSP were based on a mixture of methods and approaches, including the audiolingual, direct, and physical response methods.4 This mix of methods was intended to meet the different needs of the examinees in achieving NATO L2.
The participant selection was carried out by the trainer in accordance with the training strategies and the capacities of the group with the approval of the selected examinees.
A total of 50 participants took part in the study and started the first period of observation and analysis of the results after the first NATO L2 exams, but only 40 of them took part in the TSP. A total of 20 examinees took part in the six-month TSP and 20 took part in the one-year TSP. The participants were purposively selected to obtain a representative sample from different military categories and ranks: 10 officers (ranks from Lieutenant to Colonel); 20 non-commissioned officers (NCO-level leaders; ranks from Sergeant to Sergeant Major); 20 enlisted personnel (ranks from Private to all grades of Corporal).k
The first two studies involved a one-on-one survey administration method (administering tests and psychological instruments) together with a face-to-face interview. This method is very time-intensive but can yield rich qualitative data related to the participants’ cognitions, feelings, and behaviours. It was completed by 50 examinees immediately before the NATO L2 exams, after they had been trained under the existing training programs.
The last two phases involved the one-on-one survey administration method followed by a face-to-face interview. It was completed by 40 examinees immediately after the NATO L2 exams, once they had received the TSP.
This study examined the factors most strongly associated with the unsatisfactory results of the previous L2 training program. The study collected information from all 50 participants in face-to-face, researcher-administered surveys.
Study 1 was carried out through a set of 20 questions (Box 1) to gather information about the examinees’ opinions after their NATO L2 training (TSP not applied) and before the NATO L2 exam. All 50 participants took part in this study. The questions could be applied as part of any other language training course evaluation, i.e. they are not restricted to NATO exams. Five of the questions were open-ended and asked the examinees to provide their opinion. The remaining questions were “scale and rate” close-ended questions (examinees were asked to rate the degree to which they agreed or were satisfied) in order to quantify the problem by generating numerical data that could be transformed into usable statistics (scale = 1-5; a higher score indicates greater satisfaction/agreement/importance; a brief illustrative scoring sketch follows Box 1).
1. How many SLP courses have you attended? Please explain the way the courses were designed and structured.
2. What do you think about your current English level after the training and before the L2 exam?
3. Do you think that L2 is a fair measure of your English level? Please explain in detail.
4. Do you have any suggestions on how to train any of the four skills?
5. What advice would you give to another examinee who is considering taking this training?
The “scale and rate” close-ended questions:
1. How important is it for you to pass the L2 exam?
2. Have you received information about the L2 exam?
3. How motivated are you to study English?
4. Did you learn English before you started attending the NATO L2 training?
5. How do you perceive your training in order to take a test to achieve the NATO L2?
6. Are you satisfied with the whole training?
7. Do you think the trainers did a good job?
8. Did you receive clear feedback from the trainer?
9. Has the ongoing evaluation been useful for your motivation?
10. Has the training been efficient in improving the four skills?
11. Has the training been well-organized?
12. Has the classroom been well-equipped with good technical tools and materials?
13. Should the training last longer?
14. Do you think that the training is better with fewer students per class?
15. Do you expect to achieve NATO L2 after the training?
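As noted above, the close-ended items were scored on a 1-5 scale and converted into numerical summaries. The short Python sketch below illustrates one way such responses might be tallied; the item texts are taken from Box 1, but the response values shown are purely hypothetical and are not the study’s restricted survey data.

# Illustrative only: hypothetical 1-5 responses for two Box 1 items; the study's
# actual survey data are under restricted access.
from statistics import mean

responses = {
    "How important is it for you to pass the L2 exam?": [5, 5, 4, 5, 3],
    "Are you satisfied with the whole training?": [2, 1, 3, 2, 2],
}

for item, scores in responses.items():
    # Higher scores indicate greater satisfaction/agreement/importance.
    counts = {point: scores.count(point) for point in range(1, 6)}
    print(f"{item}\n  mean = {mean(scores):.1f}, distribution = {counts}")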
Exam data was collected from the participants (n = 50) after they had taken the L2 exam.
The elements, materials, and procedures for conducting the TSP were drawn from previous literature and methods, military manuals (STANAG 6001 and the BILC websitel), books, and essays. The training was personalized in accordance with the examinees’ needs. In this changing and competitive world, a good command of English can determine a candidate’s future in military life.
For the proposed training program to be successful (NATO L2 achievement), the four skills (reading, listening, speaking, and writing) must be integrated in an effective way. These skills must be addressed in a way that helps examinees develop their language competence gradually. Listening and speaking are highly interrelated and work simultaneously in some real-life situations. So, the integration of the two skills aims at fostering effective oral communication in order to achieve NATO L2 in both skills. Reading and writing are also intimately associated. Developing students’ competencies in reading and writing requires exposing students to gradually challenging reading materials and writing tasks.
The TSP tries to make students read and write effectively. The proposed names for the training of each skill are outlined below:
(1) Listening: “Listening fatigue stairs”: “Fatigue” here refers to the subjective feeling of a gradual onset of tiredness. Keeping in mind that examinees normally lose concentration after 15-20 minutes (according to the tester’s observation of the examinee), the training carried out a step-by-step listening program, increasing the duration/intensity/length/level every month. The objective was to adapt to this intensity in order to eliminate the fatigue (a minimal illustrative schedule is sketched after this list).
(2) Speaking: “Breaking the ceiling”: The main goal of the tester is to find the examinee’s limit of knowledge through intense training. The most important failure in examinees’ performance is that they sometimes get stuck and need the tester’s help to continue the oral proficiency interview (OPI). Consequently, some of them feel frustrated and fail. Thus, one of the missions of the tester is to sample the examinee’s language and elicit the highest level the examinee can reach. Some relatively difficult questions are intentionally asked in order to find the upper limits of the examinee’s ability. The OPI allows a tester to assign a global rating that describes what a speaker can do with the language.
(3) Reading: “Read, Breathe, Read, Comprehend”: The training considers that developing a reading habit is very important; the skill will be developed and acquired automatically once the habit has been established. The best and easiest approach, however, is to start by making a small effort to read a piece of text each day. The second goal is to read a book every month. The training points out the reading and comprehension techniques recommended from our experience and close observation. The reading training proposal covers a period of more than a year of reading books (novels, short stories, fiction). Trainers must train the examinees in guessing the meaning of unknown words from the context. One of the proposals of the training is to read anything enjoyable (one way to assimilate information unconsciously and retain it automatically), in addition to military matters, announcements, and discussions about everyday life, which will be essential in order to achieve NATO L2. Finally, successful reading comprehension requires the integration and application of multiple strategies (memory, cognitive, compensation, metacognitive, affective, social, and test-taking), and the training focuses attention on those strategies, in particular the test-taking strategy.
(4) Writing: “Drafting and Multi-Wording”: The training focuses attention on the importance of “Drafting” (transferring thoughts about the topic into main ideas to be developed, organizing the main ideas and concepts, creating a conceptual map, and translating the ideas and thoughts into words, phrases, and sentences) and “Multi-Wording” (trying to find synonyms or playing with chosen words and sentences to achieve a particular effect; a kind of brainstorming is included).
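As a simple illustration of the “listening fatigue stairs” idea, the sketch below lays out a month-by-month increase in listening duration. The 15-minute starting point reflects the concentration window mentioned above; the 5-minute monthly increment and the six-month horizon are assumptions made for this example, not values prescribed by the TSP.

# Illustrative only: a possible "listening fatigue stairs" schedule. The starting
# duration follows the 15-20 minute concentration window noted in the text; the
# step size and number of months are assumptions for this sketch.
def fatigue_stairs(start_minutes: int = 15, step: int = 5, months: int = 6) -> dict:
    """Return a target listening duration (in minutes) for each month of training."""
    return {month: start_minutes + step * (month - 1) for month in range(1, months + 1)}

print(fatigue_stairs())
# {1: 15, 2: 20, 3: 25, 4: 30, 5: 35, 6: 40}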
Additional information about what the TSP contained can be found in the Extended data.
Six-month training: after discarding the 10 examinees who had achieved NATO L2 (see Study 2), the 20 examinees with the best SLP results from the previous L2 exam were chosen. The criterion was the selection of examinees whose levels ranged approximately from L1 to just below L2.
One-year training: after discarding the 10 examinees who had achieved NATO L2 (see Study 2) and the 20 examinees chosen for the six-month TSP training, the remaining 20 examinees were chosen. The criterion was the selection of examinees whose levels ranged approximately from L0 or L0+ to just below L1.
Exam data was collected from the participants (n = 40) after they had taken the L2 exam.
Study 4 surveyed the participants who had been through the TSP training, administering a face-to-face survey (20 questions; Box 2). This was used to gather information about the examinees’ opinions after the TSP training and after the NATO L2 exam.
1. How long has your TSP training lasted? Please explain how it was.
2. What do you think about your current English level after your TSP training?
3. Do you think that the TSP training and NATO L2 exam are a fair measure of your English level? Please explain in detail.
4. Do you have any suggestions about the TSP training?
5. What advice would you give to another examinee who is considering taking this training?
Scale and rate questions:
1. How important has the TSP training been for you to pass the L2 exam?
2. Have you received more information about the L2 exam during the TSP training?
3. How motivated are you to continue studying English?
4. Do you think that you are ready to start L3 training?
5. How did you perceive your training for sitting the L2 exam?
6. Are you satisfied with the whole TSP training?
7. Do you think the trainers did a good job?
8. Did you receive clear feedback from the trainer?
9. Has the ongoing evaluation been useful for your motivation?
10. Has the training been efficient in improving the four skills?
11. Has the training been well-organized?
12. Has the classroom been well-equipped with good technical equipment?
13. Should the training last longer?
14. Do you think that classes should have fewer students?
15. Do you expect to achieve L2 again after a month or a year?
As in Study 1, five questions were open-ended and the rest were scale-and-rate (scale = 1-5; a higher score indicates greater satisfaction/agreement/importance).
Motivation, interaction, follow-up, and feedback, rather than monotony and apparently redundant programs, were important factors for the participants during L2 training. The responses to the “scale and rate” close-ended questions are shown in Table 2.
See Box 1 for more details on these questions. Numbers in red indicate the scale with the highest number of responders.
The study intended to quantify attitudes, opinions, behaviors, and other variables to draw relevant conclusions. This part is supplemented by the open-ended questions about motivations, expectations, and feelings (see Box 1 for more information on these questions).
The goal was to obtain a better idea of the examinees’ point of view on L2. It was shown that although examinees were very interested in achieving L2, they didn’t have enough information about the L2 exam. On the other hand, the motivation of the examinees to sit the NATO L2 exams was acceptable and most of them had experience in the preparation of NATO L2 exams. The majority of the participants weren’t confident with the English language and weren’t satisfied with the training, trainers, and the organization of the course. Moreover, the examinees felt that they needed longer courses with fewer students.
From the open-ended questions, few of the examinees had attended other SLP courses and most of them thought that they had improved their level of proficiency after the training, but not enough to pass the L2 or to cover their expectations. It is particularly noteworthy that almost all of them felt that the program was redundant and monotonous.
Important variations were seen in L2 outcomes, since there were different training programs. The results showed that, after various Functional Level trainings, only 10 of 50 examinees achieved NATO L2. This is a poor outcome considering that they were supposed to have an acceptable level of proficiency.
The TSP was divided into two kinds of training, six months or one year, depending on the examinee’s level (based on the NATO L2 exam results mentioned above).
After the six-month TSP training, 9 out of 20 examinees achieved L2.
After the one-year TSP training, 6 out of 20 examinees achieved L2.
Therefore, after TSP only 15 of 40 examinees achieved NATO L2. However, it can be interpreted that NATO L2 exam results did improve after the TSP training as 15 examinees who previously did not pass, passed after the TSP.
Responses of the “scale and rate” close-ended questions are shown in Table 3.
See Box 2 for more information. Numbers in red indicate the scale with the highest number of responders.
The goal of this study was to carry out an in-depth analysis of the responses in order to have a clear idea of the examinees’ point of view. It was obvious that the examinees were very interested in achieving NATO Level 2. The examinees stated that they were very satisfied with the TSP training and that they had received enough information about the L2 exam. In addition, their motivation had been increasing and most of them were interested in improving their English learning processes. This makes a compelling case for reviewing the effectiveness and efficiency of the TSP training.
Most of the participants reported not feeling confident enough to start L3 training because of the difference in difficulty between the two levels (NATO L2 and NATO L3). In relation to the TSP training, they were satisfied with the high level of the trainers, tools, materials, equipment, and the TSP organization and program. On the other hand, some examinees felt again that they needed longer courses with fewer students.
From the open-ended questions, examinees from both the six-month and the one-year TSP training reported that they had improved their level of proficiency after the TSP training, enough to achieve L2 and to meet their expectations. It is particularly noteworthy that almost all of them felt that the TSP training program was motivational and interesting. Consequently, most of them recommended it to other examinees.
Highlighting some examinees’ perspectives about the whole process:
1. Most of them felt that the TSP training had connected the learning of the four L2 skills with the learning of the English language per se.
2. They were aware of the importance of carrying out a long program with fewer students and a variety of activities despite the intense “listening fatigue stairs”.
3. L2 achievement is a real challenge, and examinees need to be monitored by trainers who relate the examinees’ previous knowledge to their personal motivation.
4. They need to feel able to take risks without being frightened of making mistakes. Trainers must take up the challenge of drawing out the examinees’ best English.
This study attempted to explore the application of the proposed TSP for the acquisition of functional skills in English by examinees who wanted to achieve NATO L2, and to assess the examinees’ point of view. The primary aim behind this study was to show the variations in L2 outcomes across different training programs. The NATO L2 exam results after the TSP were shown to have improved. The study therefore suggests that the unsatisfactory results (only 10 of 50 examinees achieved NATO L2 after the application of generic NATO L2 training) and the level differences among the examinees could be due primarily to the training program carried out.
Study 1: 50 participants took part in the survey and interview about generic NATO L2 trainings. The survey indicates that the motivation of the examinees to sit the NATO L2 exams was acceptable and most of them had experience in the preparation of NATO L2 exams. It is worth highlighting that most of them weren’t confident about themselves and weren’t satisfied with the training, trainers, and the organization of the course. Moreover, the examinees felt that they needed longer courses with fewer students. It was shown that few of the examinees had attended other SLP courses and most of them thought that they had improved their level of proficiency after the training, but not enough to pass the L2 or to meet their expectations. It is particularly noteworthy that almost all of them felt that the program was redundant and monotonous.
Study 2: Of the 50 participants that took the NATO L2 exam, only 10 passed.
Study 3: Of the 20 participants that took the NATO L2 exam after the six-month TSP, 9 passed the NATO L2 (45%). Of the 20 participants that took the NATO L2 exam after the one-year TSP, 6 passed the NATO L2 (30%). Therefore, only 15 of 40 (37.5%) examinees achieved NATO L2 after the TSP.
Study 4: 40 participants took part in the survey and interview about the application of the proposed training (TSP). The survey indicates that the examinees were very interested in achieving NATO Level 2. It is noteworthy that they were very satisfied with the proposed TSP training and reported that they had received enough information about the L2 exam. In addition, their motivation had increased and most of them were interested in improving their English learning processes. Most of them weren’t confident enough to start L3 training because of the increased difficulty between NATO L2 and L3. The examinees were satisfied with the high level of the trainers, tools, materials, equipment, and the TSP organization and program. On the other hand, some examinees felt again that they needed longer courses with fewer students. The most important conclusion is that most of them thought that they had improved their level of proficiency after the TSP training.
In relation to the overall aim of the study, the results show that there is a real need for the application of other training in order to successfully achieve NATO L2 requirements. It is commonly known that it is very difficult for all examinees with different levels to achieve 100% satisfactory results. Therefore, the TSP could help NATO L2 trainers and private academies to make changes and improvements.
The data collected showed that only 10 of 50 (20%) examinees achieved L2 after the application of the generic Functional Level trainings, while 15 of the remaining 40 (37.5%) examinees achieved L2 after the proposed TSP training. Some examinees showed greater improvement, while others showed only modest improvement. It therefore follows that the NATO L2 exam results after the TSP improved considerably (from a 20% to a 37.5% success rate).
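The paper reports only raw proportions (20% before the TSP versus 37.5% after it). As a rough indication of how that difference might be examined statistically, the Python sketch below runs a two-proportion z-test; treating the two groups as independent samples is a simplification, since the 40 TSP examinees were drawn from the original 50, so the result is illustrative rather than a formal analysis of the study’s data.

# Illustrative only: a two-proportion z-test on the reported pass rates
# (10/50 before the TSP vs. 15/40 after it). Independence of the two samples is
# an assumption; the TSP examinees were a subset of the original cohort.
from math import sqrt, erf

def two_proportion_z(x1: int, n1: int, x2: int, n2: int):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(10, 50, 15, 40)
print(f"z = {z:.2f}, p = {p:.3f}")  # roughly z = 1.84, p = 0.07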
According to the results obtained from the examinee surveys (Studies 1 and 4), the most important differences between the two training styles could be the increased motivation, interaction, follow-up, and feedback of the TSP, as opposed to monotonous and redundant programs. A further point to consider is that the Functional Level trainings used to achieve NATO L2 have not undergone significant revision in recent years.m
Finally, the study did not demonstrate whether there is any benefit from implementing the proposed TSP program in any civilian setting (CEF). However, given the similarities between the two fields (military SLP exams and CEF requirements), it could be expected that the TSP training could be applied to other fields. Future studies should assess whether the TSP training is useful in civilian comprehensive testing systems, such as the Test of English as a Foreign Language (TOEFL), the International English Language Testing System (IELTS), or the Cambridge Assessment of Spoken English (CASE).
In conclusion, it is important to consider the results of the study and the TSP training outcomes in order to find an appropriate training which could be useful not only in the NATO testing system but in all comprehensive testing systems. The need to find a common training could be the starting point for critically examining teaching/learning and testing systems for better planning and the development of an effective implementation at all levels of proficiency.
Due to the restrictions surrounding military data records, the underlying data for this study cannot be provided. This includes the survey data in Studies 1 and 4 and the NATO L2 scores in Studies 2 and 3. Researchers wishing to access these data should liaise with the corresponding author (mnunesp@et.mde.es), who will facilitate obtaining the data through military channels (Spanish Ministry of Defence).
DANS: Training for NATO level of English language, https://doi.org/10.17026/dans-xzh-54fv.5
This project contains the following extended data:
• A proposal of some training strategies for the improvement of NATO Level 2 results in the Functional Level exams in English in the field of the Armed Forces.pdf (official results from all participants, the summary of the surveys and the follow up during the training)
• TFM LIA MIGUEL ÁNGEL NÚÑEZ ESPINOSA.pdf (Master´s Thesis, containing a full description of the TSP)
These files are under restricted access due to the restrictions noted above. Access will be facilitated through the DANS website and approved by the corresponding author.
Eva Samaniego Fernández confirms that the author has an appropriate level of expertise to conduct this research, and confirms that the submission is of an acceptable scientific standard. Eva Samaniego Fernández declares they have no competing interests. Affiliation: Universidad Nacional de Educación a Distancia.
Is the work clearly and accurately presented and does it cite the current literature?
Yes
Is the study design appropriate and is the work technically sound?
Yes
Are sufficient details of methods and analysis provided to allow replication by others?
Yes
If applicable, is the statistical analysis and its interpretation appropriate?
Partly
Are all the source data underlying the results available to ensure full reproducibility?
Partly
Are the conclusions drawn adequately supported by the results?
Yes
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Morphology, syntax, corpus linguistics, experimental linguistics
Is the work clearly and accurately presented and does it cite the current literature?
No
Is the study design appropriate and is the work technically sound?
No
Are sufficient details of methods and analysis provided to allow replication by others?
No
If applicable, is the statistical analysis and its interpretation appropriate?
No
Are all the source data underlying the results available to ensure full reproducibility?
No
Are the conclusions drawn adequately supported by the results?
No
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: English language teaching and learning; ESP; course design
Is the work clearly and accurately presented and does it cite the current literature?
Partly
Is the study design appropriate and is the work technically sound?
Yes
Are sufficient details of methods and analysis provided to allow replication by others?
No
If applicable, is the statistical analysis and its interpretation appropriate?
Yes
Are all the source data underlying the results available to ensure full reproducibility?
No
Are the conclusions drawn adequately supported by the results?
Partly
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: military terminology, ESP, intercultural communication