About the Author(s)


Willem C. van Wyk
Department of Applied Languages, Faculty of Humanities, Tshwane University of Technology, Pretoria, South Africa

Gary W. Collins
Department of Applied Languages, Faculty of Humanities, Tshwane University of Technology, Pretoria, South Africa

Maria M. Swanepoel
End-user Computing Unit, Faculty of Information and Communication Technology (ICT), Tshwane University of Technology, Pretoria, South Africa

Citation


Van Wyk, W.C., Collins, G.W. & Swanepoel, M.M., 2024, ‘Assessing the reliability of success predictors in English proficiency among journalism students’, Transformation in Higher Education 9(0), a379. https://doi.org/10.4102/the.v9i0.379

Original Research

Assessing the reliability of success predictors in English proficiency among journalism students

Willem C. van Wyk, Gary W. Collins, Maria M. Swanepoel

Received: 06 Mar. 2024; Accepted: 29 July 2024; Published: 29 Aug. 2024

Copyright: © 2024. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Journalism’s societal role hinges on effective communication, demanding proficient language skills. This study assesses the Department of Journalism at a South African University of Technology, probing the efficacy of its existing selection methods in predicting success in two English modules. Concerns persist about whether these methods accurately identify students with essential language proficiency. The research explores alternatives for better assessing the language abilities crucial for journalistic proficiency. The literature review underscores language proficiency’s significance in journalism education, especially in African contexts, and notes the challenges faced by South African universities. Current practices, relying on high school English grades and standardised tests, may not precisely reflect the journalism-specific language skills required. Using a non-experimental quantitative design, the study analyses correlations between selection methods and student performance in English modules. Findings indicate weak correlations, advocating for a more comprehensive approach with alternative assessments. Proposed alternatives include writing samples, portfolio reviews, and situational judgement tests for a holistic selection process. The study suggests future research directions, emphasising broader generalisability and exploring correlations with practical skills and critical thinking. It emphasises the imperative for journalism departments to re-evaluate and enhance selection criteria, aligning them with dynamic professional demands.

Contribution: This article contributes to journalism education by illuminating the shortcomings of current selection methods in assessing potential for success in English language-related modules. It proposes alternatives and advocates for a holistic approach, providing practical insights for journalism departments aiming to align their selection criteria with the evolving demands of the profession. Ultimately, this contribution aids in the development of skilled journalists.

Keywords: journalism; selection methods; language proficiency; academic performance; South African higher education; holistic evaluation; journalism education.

Introduction

This study aimed to evaluate the effectiveness of matriculation English grades, selection tests, and interviews as predictors of student success in English I and II modules within a journalism programme. By examining the correlation between these traditional success predictors and academic performance, we sought to determine their utility in identifying students poised for success in the complex field of journalism.

Journalism plays a vital role in society, serving as a bridge between complex information and the public. Effective communication is crucial for journalists to fulfil their responsibilities, which encompass tasks such as disseminating news, conducting interviews, and crafting compelling narratives. While language proficiency is widely recognised as essential for journalistic success, concerns persist regarding the effectiveness of current selection methods in identifying students with the necessary English language skills.

Current practices often rely on metrics such as high school English grades and standardised tests, which may not accurately reflect the specific language abilities needed for journalism. These traditional methods might assess the kind of knowledge captured by matric symbols (grades) rather than the real-world communication skills crucial for the profession. This potential disconnect is further highlighted by the discrepancy between high school English grades and the observed language proficiency of students entering journalism programmes.

This study aims to address this gap by investigating the effectiveness of current selection methods used by the Department of Journalism at a prominent South African University of Technology (UoT). We specifically examine whether these methods accurately predict success in English I and II modules, considered foundational courses for developing journalistic skills. Additionally, we explore the need for alternative selection tools that better assess the essential language abilities required for journalistic proficiency within academic and professional contexts.

Literature review

The following review explores the existing body of knowledge concerning language proficiency and its role in journalism education, with a specific focus on the African context and the challenges encountered by South African universities. Additionally, it examines the intricacies of admissions assessments in the contemporary university landscape, emphasising the necessity for a balanced approach that accounts for both standardised measures and individual strengths pertinent to fields such as journalism.

Language and the essence of journalism

At its core, journalism revolves around effectively communicating information to the public. This communication relies on the precise and sophisticated use of language. Nkoala (2020) contends that language shapes our perception of reality; words not only label things but also actively construct our understanding of the world. In journalism, where clarity and accuracy are paramount, language proficiency becomes indispensable.

This proficiency is critical for journalism students on multiple levels. Strong language skills enable students to grasp the complex concepts explored in lectures and coursework (Taylor & Von Fintel 2016). Without a firm foundation in the language of instruction, students may struggle to follow along, potentially leading to feelings of inadequacy and exclusion (Meier & Hartell 2009). However, Nkoala (2020) warns against placing undue emphasis on a single dominant language. Journalism education should recognise the value of diverse communication styles and explore the potential of multilingual journalism practices.

Beyond academia, language proficiency is a cornerstone of professional success in journalism. Studies such as Fosu’s (2011) reveal a concerning trend: strong language skills are often taken for granted in journalism education. Graduates may find themselves ill-equipped to write clear and concise news stories, hindering their ability to effectively communicate with the public. Madinga, Maziriri and Lose (2016) further underscore the challenges faced by international students encountering language barriers. These barriers can significantly impede their ability to gather information, conduct interviews, and craft compelling and accurate stories.

Mulligan and Kirkpatrick (2000) closely examine the impact of language barriers, highlighting how they can hinder not only information gathering but also students’ self-confidence and help-seeking behaviours. Ramburuth and Tani (2009) echo this sentiment, suggesting that language barriers can lead to decreased student engagement and a sense of alienation from the learning process. Muswede (2016) argues that today’s journalists require a broad knowledge base that encompasses not only technical know-how but also high competence in linguistic skills. They must be able to adapt their language use to diverse audiences and contexts.

Basilan and Padilla (2023) emphasise the responsibility of educators to equip students with a strong foundation in literacy and language skills. They argue that education departments play a crucial role in ensuring effective journalism programmes within educational institutions. By prioritising language development, educators can empower students to overcome challenges associated with low literacy levels and limited vocabulary (Basilan & Padilla 2023).

By fostering strong language skills, educators can empower students to excel in their academic pursuits and become effective communicators within the field of journalism. This not only benefits individual journalists but also strengthens the quality of journalism, fostering a more informed and engaged citizenry (Vukic 2022).

The case of English in Africa and South Africa

English language proficiency is crucial for journalism, especially in the African and South African context, because of its dominant role in media and education. Studies by Nkoala (2020) and Taylor and Von Fintel (2016) highlight how language use in journalism education can impact students’ perceptions, including feelings of exclusion and inadequacy if they are not proficient in English. This can hinder their ability to learn and participate effectively, even if they excel in other journalistic skills (Meier & Hartell 2009). Fosu (2011) further emphasises the importance of English proficiency by pointing out the prevalence of grammatically flawed writing in African newspapers, suggesting a gap between expected skills and the reality of media practice. This can limit the effectiveness of communication and hinder the media’s ability to serve the public sphere (Fosu 2011). Additionally, research by Madinga et al. (2016) and Ramburuth and Tani (2009) highlights the negative impact of language barriers on students’ information gathering, learning abilities, and confidence. This underscores the need for journalism education to address language proficiency as a critical aspect of preparing future journalists, especially in contexts such as South Africa where English is the primary language of instruction (Madinga et al. 2016).

Several studies emphasise the critical need for improvement in language skills within journalism education, particularly concerning English proficiency in African contexts. Scholars such as Fosu (2011) advocate for a re-evaluation of English language proficiency and training in journalism schools to equip students with the necessary skills for effective communication and professional success. Muswede (2016) emphasises the need for a comprehensive approach, focusing not only on language but also on technical know-how and a broad knowledge base. This holistic approach is crucial for preparing future journalists to navigate the complexities of the contemporary media landscape.

Admissions assessments in the modern university: Balancing efficiency and fairness

Universities worldwide employ various assessments to select students for their programmes. These assessments, ranging from high school transcripts and standardised tests to in-depth selection interviews, aim to predict student success. However, their effectiveness and potential biases are important considerations in the modern university landscape.

High school grades and national exams offer standardised ways to compare applicants across different educational systems (Silva et al. 2020). However, they may not capture non-academic factors such as motivation and effort, potentially disadvantaging students from underprivileged backgrounds (Silva et al. 2020). Standardised tests such as the Scholastic Assessment Test (SAT) and American College Testing (ACT) aim to assess general cognitive abilities, but critics argue they can be biased against specific demographics and overlook diverse talents (Zwick 2019). Interviews, on the other hand, offer a more personalised evaluation, allowing assessors to consider interpersonal skills and potential beyond grades, and have been shown to be valuable tools in programmes such as teacher training and journalism (Levy-Feldman & Libman 2020).

Studies suggest that a combination of assessments can be more accurate than relying solely on one measure (Levy-Feldman & Libman 2020). However, the effectiveness of these tools in predicting success in specific fields such as journalism remains unclear. A significant challenge lies in ensuring fairness and inclusivity. Standardised assessments may disadvantage students from underprivileged backgrounds because of limited access to quality education and resources (Silva et al. 2020). Additionally, focusing solely on standardised measures might overlook crucial skills and experiences essential for certain fields, such as critical thinking, creativity, and communication skills valued in journalism education (Moreira & Lago 2017).

There is a need for a balanced approach that considers both standardised assessments and individual strengths and experiences. This could involve incorporating portfolios showcasing relevant work, essays demonstrating writing skills and critical thinking, and extracurricular activities that highlight diverse skillsets and experiences into the evaluation process. Studies in Portugal, for example, suggest that this multifaceted approach can lead to fairer admissions practices (Silva et al. 2020).

In the specific context of journalism education, a holistic approach that considers critical thinking, creativity, and communication skills alongside other factors, such as portfolios and interviews, could be crucial for identifying individuals with the potential to succeed in this dynamic field (Fosu 2011).

Students’ preferences

Students’ preferences may exert a significant influence on the selection process for academic courses. The admission interview, where it is applied, often emerges as a decisive tool, serving as a statistically significant predictor of enrolment and a crucial factor in forecasting students’ grades (Levy-Feldman & Libman 2020). This could be seen as a preference-based approach and often proves valuable for borderline candidates, allowing for a more nuanced evaluation that goes beyond traditional matriculation and psychometric admission tools. The interview tool facilitates a thorough discussion with and about candidates who may not strictly meet grade admission requirements, but possess other commendable qualities, such as drive and enthusiasm, emphasising the role of student preferences in shaping a holistic selection process (Levy-Feldman & Libman 2020). This emphasis on preferences was shown to extend globally, as seen in Ghana, where admission criteria based on the West African Senior Secondary School Certificate Examinations (WASSCE) English pass reflect a consideration of student language competence as well as other preferences (Fosu 2011). Discussions on the decline of English proficiency in journalism education in Ghana further underscore the importance of students’ preferences, prompting calls for stringent admission measures to ensure the enrolment of quality students (Fosu 2011).

Across diverse educational contexts, the recognition of student preferences becomes a central theme, emphasising the need for a comprehensive and inclusive admission process that aligns with the diverse qualities and aspirations of prospective students. Furthermore, the studies highlight the implicit link between students’ preferences and intangible qualities such as enthusiasm and attitude, suggesting that a nuanced consideration of these factors could serve as valuable criteria in the selection process, ensuring that candidates not only meet academic requirements but also bring a positive and proactive approach to their academic journey (Fosu 2011; Levy-Feldman & Libman 2020).

Writing samples, portfolios, and situational judgement tests as alternative assessment tools

Writing Sample Assessments are commonly used by English and English as a Second Language (ESL) instructors as diagnostic tools at the beginning of a course. These assessments serve two primary functions: adjusting student placements and informing instructors of students’ skill levels. For instance, some colleges use writing samples to place students in the appropriate track, while others use them to identify students who might struggle in the course and recommend alternative classes such as ESL courses (Bunch et al. 2011). This approach allows instructors to tailor their teaching methods to the skill levels of their students, thereby enhancing the learning experience and ensuring that students are adequately prepared for the course material.

Portfolio reviews represent a collaborative and comprehensive approach to admissions that involves both admissions and academic affairs. The process typically includes the submission of various student works assembled according to specific guidelines and evaluated using standardised rubrics. These rubrics are frequently updated and standardised by student services in consultation with faculty to ensure consistency and fairness (DeDominicis & Zabolotney 2020).

Portfolio reviews may be conducted individually or in teams. Faculty members review application materials in advance, including multimedia components such as video responses, and then convene to rank applicants. This method not only evaluates academic qualifications but also provides insight into the applicants’ intentions and readiness for the programme (Willis & Martinez 2023).

The holistic nature of portfolio reviews allows for the differentiation of academically indistinguishable applicants and broadens the range of experiences that can be included in an application. It enables institutions to capture multiple forms of excellence and ensure that applicants are intentional about their pursuit of a degree (Willis & Martinez 2023). Moreover, the detailed feedback provided through portfolio reviews offers valuable insights into programme culture, pedagogy, and community, which can attract prospective students (Willis & Martinez 2023).

Situational Judgement Tests (SJTs) are widely used in personnel selection and promotion because of their ability to present realistic job-related scenarios to assessors. These tests measure constructs related to making judgements in challenging situations by having examinees evaluate response options to problem scenarios (Whetzel, Sullivan & McCloy 2020). The SJTs consist of two main components: a scenario describing a situation and a set of plausible response options. The format and instructions for SJTs can significantly affect the constructs measured and the scores obtained (Whetzel et al. 2020).
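
To make the two-component structure just described concrete, the following is a purely illustrative sketch of how an SJT item might be represented in code; the scenario, response options, and scoring key are invented for illustration and are not drawn from any instrument in the cited studies.

```python
from dataclasses import dataclass, field

@dataclass
class SJTItem:
    """An SJT item: a scenario plus a set of plausible response options."""
    scenario: str
    options: list[str]
    keyed_scores: list[float] = field(default_factory=list)  # expert ratings per option

# Hypothetical journalism-flavoured item (invented for illustration).
item = SJTItem(
    scenario=("A source asks to approve your article before publication and "
              "threatens to withdraw their quotes if you refuse."),
    options=[
        "Agree, to preserve the relationship with the source.",
        "Refuse, but offer to verify the accuracy of their quotes.",
        "Refer the request to your editor before responding.",
    ],
    keyed_scores=[1.0, 3.0, 2.5],
)
```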

The SJTs can be conceptualised as either low-fidelity simulations, which emphasise contextualised knowledge, or as measures of more general domain knowledge. Research indicates that the inclusion of situation descriptions in SJTs does not always significantly affect performance, suggesting that SJTs can measure both context-dependent and context-independent knowledge (Krumm et al. 2015).

The flexibility of SJTs allows them to assess a wide range of constructs, including job knowledge and skills, applied social skills, and basic personality tendencies (Krumm et al. 2015). This versatility makes them suitable for various selection contexts, from assessing technical skills in specific professions to evaluating general social and leadership skills (Christian, Edwards & Bradley 2010).

Writing Sample Assessments, Portfolio Reviews, and SJTs each offer unique advantages as admissions tests. Writing Sample Assessments provide a straightforward method for assessing language proficiency and academic readiness. Portfolio Reviews offer a comprehensive and collaborative approach that captures a wide range of applicant qualities and experiences. Situational Judgement Tests, with their ability to measure judgement and decision-making in realistic scenarios, provide valuable insights into applicants’ practical skills and knowledge. Each method has its strengths and can be effectively used depending on the specific goals and context of the admissions process.

Research methods and design

The literature review underscores the pivotal role of language proficiency, specifically English, within the African context, as imperative for success in journalism. Nevertheless, the efficacy of conventional admissions assessments in forecasting success in specialised fields such as journalism remains ambiguous. This study aims to bridge this gap by adopting a non-experimental quantitative design to scrutinise the correlation between current selection methods (including Matric English grades, entry tests, and interviews) and student performance in first- and second-year English modules at a South African UoT.

Study design

This study employed a non-experimental quantitative design to ascertain the correlation between Matric English grades, entry tests, and selection interviews and the grades achieved in first- and second-year English modules. A case study strategy was adopted, utilising secondary data collected during the admission process. These data were chosen because they were readily available and were expected to provide answers to the research questions. The units of analysis were, therefore, the students’ matric symbols and the results of the departmental admission tests and interviews. These were compared with the grades obtained by journalism students for English (A Level) during their first and second year of study.

Population, sampling, and data collection

The sample population was predetermined using results obtained from all second-year journalism students at a South African UoT from 2016 to 2019, constituting a total population sample. The grades obtained by students in English I and II were used as the data against which the initial predictors were compared. The group comprised 221 students of both genders from various socio-economic backgrounds, cultures, feeder areas, and first languages (L1s). This included 59 students from 2016, 80 from 2017, 33 from 2018, and 50 from 2019. These year groups were selected because, after 2019, the disruption caused by the coronavirus disease 2019 (COVID-19) pandemic could have made comparisons unreliable.

The selection test, included among the secondary data analysed, primarily assessed language proficiency through grammar, vocabulary, and comprehension exercises. The semi-structured interviews, also part of these data, typically lasted approximately 30 min and were conducted to evaluate candidates’ communication skills, language fluency, and alignment with core journalistic competencies. Interviewers followed a standardised protocol, ensuring consistent assessment across all candidates.

Data analysis

Pearson’s correlation coefficient was used to determine the relationship between the sets of data. The correlation coefficient measures the strength and direction of the linear relationship between two variables; where a sufficiently strong linear relationship exists, a regression equation can then be fitted and used to make predictions about the data.

In the broadest sense, correlation ‘is the measure of an association between variables’ (Schober, Boer & Schwarte 2018:1763). When analysing correlated data, ‘the change in the magnitude of 1 variable is associated with a change in the magnitude of another variable’ (Schober et al. 2018:1763). The letter ‘r’ is typically used to represent the relationship between two variables ‘with a number that varies between -1 and +1’ (Akoglu 2018:91). Zero indicates that there is no correlation and 1 indicates a perfect correlation. The direction of the correlation is indicated by the sign of r: where the sign is negative, the variables are inversely related, and the ‘strength of the correlation increases both from 0 to +1, and 0 to -1’ (Akoglu 2018:91). A positive relationship is one in which variables correspond positively to one another (high to high, low to low), and a negative relationship is one in which variables correspond negatively (high to low and low to high) (Maree & Pietersen 2016). In the study, the following relationships were tested to determine their correlation (a minimal sketch of the computation follows the list):

  • The relationship between the English matric symbol and performance in English I and II
  • The relationship between the result of the selection test and the performance in English I and II
  • The relationship between the result of the selection interview and the performance in English I and II
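
As a minimal sketch of the analysis described above, the following Python snippet computes Pearson’s r between each predictor and each module outcome. The file name and column names are hypothetical, since the study’s dataset is not public.

```python
# Minimal sketch of the correlation analysis; the file and column names are
# hypothetical, as the study's dataset is not publicly available.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("journalism_cohort.csv")  # one row per student (hypothetical file)

predictors = ["matric_english", "selection_test", "selection_interview"]
outcomes = ["english_1", "english_2"]

for predictor in predictors:
    for outcome in outcomes:
        r, _ = pearsonr(df[predictor], df[outcome])  # p-value not used here
        print(f"{predictor} vs {outcome}: r = {r:.2f}")
```
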
Ethical considerations

The researchers received final ethical approval from the academic institution’s research ethics committee, which is a registered Institutional Review Board (IRB 00005968) with the US Office for Human Research Protections (IORG# 0004997). This committee holds Federal Wide Assurance for the Protection of Human Subjects for International Institutions (FWA 00011501). In South Africa, it is registered with the National Health Research Ethics Council (REC-160509-21). Ethical clearance to conduct this study was obtained from the Tshwane University of Technology Research Ethics Committee (No. REC2022/09/010).

Results

The study on which this article reports aimed to determine the extent to which the selection methods utilised by the Department of Journalism at a South African UoT are effective in enrolling students who would be successful in their studies and produce work of an acceptable standard once they enter the journalism profession. To achieve this goal, the study considered the following questions to be pertinent:

  • To what extent are matric symbols for English, the departmental admission test and selection interview reliable predictors of academic performance in English among journalism students at a UoT?
  • What is the correlation between English matric symbols and English I and II results of journalism students at a UoT?
  • What is the correlation between the departmental admission test and selection interview and the English I and II results of journalism students at a UoT?
  • Comparing the characteristics of the above types of assessments, which is the most accurate predictor of academic success in English among journalism students at a UoT?

When describing the correlations between Matric English grades, the selection test results, and the selection interview results on the one hand, and the results achieved in the English I and II modules on the other, a conventional approach to interpreting Pearson’s correlation coefficient was followed: a coefficient between 0.00 and 0.10 is interpreted as negligible, between 0.10 and 0.39 as weak, and between 0.40 and 0.69 as moderate (Schober et al. 2018:1765). Because total population sampling was employed to determine the Pearson’s correlation coefficient between matric results for English and the results achieved in the English I and English II modules, it was not necessary to conduct a t-test or calculate the p-value to determine statistical significance. To describe the variables, the mean, standard deviation, and minimum and maximum grades were calculated.
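
Stated compactly, this interpretation convention maps the magnitude of r to a qualitative label. The following helper is a minimal sketch of the Schober et al. (2018) thresholds as applied in this study; the ‘strong’ label for coefficients of 0.70 and above follows the same source but was not reached by any coefficient reported here.

```python
def interpret_r(r: float) -> str:
    """Qualitative label for Pearson's r, per the Schober et al. (2018)
    convention followed in this study."""
    magnitude = abs(r)
    if magnitude < 0.10:
        return "negligible"
    if magnitude < 0.40:
        return "weak"
    if magnitude < 0.70:
        return "moderate"
    return "strong"  # not reached by any coefficient reported in this study

print(interpret_r(0.30))  # 'weak', e.g. the average Matric English/English II r
```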

Matric English and how it correlates with English I and II

Table 1 presents descriptive statistics on students’ performance in Matric English, the selection tests, the selection interviews, and the English I and English II modules. It reports the mean, standard deviation, and minimum and maximum grades obtained across the different years for each component. These statistics offer a comprehensive overview of the data against which the relationship between Matric English scores and subsequent performance in English I and II was assessed, shedding light on the effectiveness of current selection methods within the Department of Journalism at a South African UoT.

TABLE 1: Descriptive statistics pertaining to the mean, the standard deviation and the minimum and maximum grades obtained for English I and English II modules, selection tests, as well as the interviews.
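
The statistics summarised in Table 1 could be reproduced along the following lines, reusing the hypothetical dataset and column names from the earlier sketch.

```python
# Sketch of the Table 1 descriptive statistics (hypothetical file and columns).
import pandas as pd

df = pd.read_csv("journalism_cohort.csv")  # hypothetical file
components = ["matric_english", "selection_test", "selection_interview",
              "english_1", "english_2"]

# Mean, standard deviation, and minimum and maximum grades per intake year.
summary = df.groupby("year")[components].agg(["mean", "std", "min", "max"]).round(2)
print(summary)
```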

Table 2 presents the Pearson’s correlation coefficients between Matric English grades and the grades achieved for both the English I and English II modules over a 4-year period. The correlation coefficients ranged from 0.32 to 0.37 for English I and exhibited weaker correlations for English II, ranging from 0.03 to 0.39. These coefficients suggest a modest relationship between Matric English grades and performance in the English I module, while the relationship with English II grades appears to be more varied. Furthermore, Table 2 includes correlations between Matric English, selection tests, and selection interview results, offering insights into their associations with performance in English I and English II modules.

TABLE 2: Pearson’s Correlation Coefficient between Matric English, selection tests, and selection interview results and English I and English II modules.

The Pearson’s correlation coefficient between Matric English grades and the grades obtained for the English II module fluctuated between 0.03 and 0.39 across the 4 years of data collection, averaging 0.30. This suggests a weak correlation between students’ grades in Matric English and their performance in the English II module. Notably, this correlation is strikingly similar to the correlation between Matric English and English I, with the latter being only slightly stronger.

Selection test and how it correlates with English I and English II

The grades attained on the selection tests ranged from 54.73% to 62.04%, with an overall mean of 58.02%. The standard deviation of the selection test varied between 6.03 and 9.51, yielding an overall standard deviation of 8.96 for the total population.

For the English I module, grades ranged from 60.00% to 66.19%, with an average score of 63.04%. The standard deviation for English I grades fell within the range from 6.87 to 8.81, resulting in an overall standard deviation of 8.17 for the total population.

Similarly, in the English II module, grades ranged from 54.39% to 59.48%, with an overall mean of 58.22%. The standard deviation for English II scores ranged from 7.19 to 11.62, resulting in an overall standard deviation of 7.69 for the total population.

The Pearson’s correlation coefficient between the selection test grades and the grades achieved for the English I module varied from 0.08 to 0.33 over the 4 years of data collection, with an average correlation of 0.20. These results indicate a weak correlation between the scores obtained on the selection tests and the grades achieved in the English I module.

Similarly, the Pearson’s correlation coefficient between the selection test scores and the grades obtained for the English II module ranged from 0.08 to 0.29 over the 4 years of data collection, with an average correlation of 0.20. This also suggests a weak correlation between the scores on the selection tests and the scores achieved in the English II module.

Selection interview and how it correlates with English I and II

The grades achieved for selection interviews ranged from 68.42% to 71.74% with an overall mean of 69.94%. The standard deviation of the interviews ranged between 8.48 and 10.42. The overall standard deviation was 9.05 for the total population.

The grades achieved for the English I module ranged from 60.00% to 66.19% with an overall mean of 63.04%. The standard deviation ranged from 6.87 to 8.81 across the cohorts, with an overall standard deviation of 8.17 for the total population.

The grades achieved for the English II module ranged from 54.39% to 59.48% with an overall mean of 58.22%. The standard deviation ranged from 7.19 to 11.62 across the cohorts, with an overall standard deviation of 7.69 for the total population.

The Pearson’s correlation coefficient between interview results and the grades achieved for the English I module ranged from 0.06 to 0.16 over the 4 years in which data were collected, with an average correlation of 0.11. This indicates a weak correlation between the grades achieved by the students for interviews and the grades achieved for the English I module.

The Pearson’s correlation coefficient between interview results and the grades achieved for the English II module ranged from 0.05 to 0.17 over the 4 years in which data were collected, with an average correlation of 0.092. This indicates a weak correlation between the grades achieved by the students for the interviews and the grades achieved for the English II module.

Addressing the research questions: Correlation between selection methods and student performance

The findings address the research questions outlined earlier in the study. To determine the extent to which matriculation English grades, selection tests, and selection interviews predict student success in English I and II modules, correlation analyses were conducted. These results suggest that traditional selection methods may not accurately predict student success in journalism programmes.

Specifically, research questions one and two were answered by the weak correlations found between matriculation English grades and selection test results, respectively, and performance in both the English I and II modules. Similarly, research question three revealed a weak correlation between selection interview scores and student performance in the English modules.

Regarding research question four, while there were slight variations in correlation coefficients across the predictors, none emerged as a significantly stronger predictor of student success than the others. This suggests that a more holistic approach to student selection may be necessary to identify candidates with the potential to succeed in journalism.

Discussion

The identified weak correlations resonate with the broader body of research, emphasising the inherent limitations of relying solely on single, standardised assessments, especially in the intricate and dynamic academic field of journalism. Such traditional assessments often fall short of encapsulating the multifaceted nature of student potential, overlooking vital dimensions such as critical thinking, research skills, creativity, and ethical judgement – integral aspects crucial for success in the ever-evolving landscape of journalism.

However, this study transcends the acknowledgement of challenges and limitations, offering innovative solutions for the refinement of the selection process. These proposals advocate for the seamless integration of alternative assessments alongside the existing methods, aiming to create a more comprehensive and nuanced approach to candidate evaluation. These alternative assessments provide valuable insights into various facets of a candidate’s potential, offering a more holistic perspective that goes beyond conventional metrics.

Writing sample assessments

This component involves the evaluation of pieces that replicate real-world journalistic tasks, such as news summaries and short articles. By directly assessing relevant research, analysis, and communication skills, this method provides a tangible measure of a candidate’s aptitude for the intricacies of journalistic work.

Portfolio reviews

The examination of a student’s previous work offers a deep dive into their writing style, voice, and the ability to translate ideas into engaging content. This method goes beyond academic performance, shedding light on a candidate’s creative prowess and their capacity to resonate with an audience – a fundamental skill for journalists in a digital age. Portfolio reviews, as collaborative efforts between admissions and academic affairs, provide a comprehensive and context-rich evaluation of the applicant’s skills and intentions (DeDominicis & Zabolotney 2020; Willis & Martinez 2023).

Situational judgement tests

Employing scenarios that simulate real-world ethical dilemmas or journalistic situations, this assessment adds a layer of complexity by evaluating a student’s decision-making skills under pressure and their understanding of journalistic principles. It provides valuable insights into how a candidate might navigate the ethical challenges inherent in journalism. These tests measure a broad range of constructs, including applied social skills and job knowledge, making them a versatile tool in the admissions process (Krumm et al. 2015; Whetzel et al. 2020).

Implementing these alternative assessments alongside existing procedures holds the potential to create a multifaceted selection process. Such a process would not only provide a more comprehensive understanding of each candidate but also reveal their aptitude for crucial journalistic skills and ethical understanding. This innovative approach aligns with the evolving demands of the journalism profession, ensuring that the selected candidates are not only academically prepared but also possess the diverse skill sets required for success in the contemporary media landscape.

The findings of this study align with the existing literature highlighting the limitations of traditional selection methods in predicting student success in journalism. The weak correlations between matriculation English grades, selection tests, and interviews with academic performance underscore the need for a more comprehensive approach. Writing sample assessments, portfolio reviews, and situational judgement tests offer promising alternatives for evaluating candidates’ abilities in research, critical thinking, communication, and ethical decision-making – all essential competencies for journalism. By incorporating these strategies, journalism programmes can better identify students with the potential to excel in the profession.

To enhance the practical application of the proposed alternative assessments, consider the following examples. Writing samples could include tasks such as crafting a news article based on a provided press release or conducting a short interview and writing a subsequent news story. Portfolio reviews might assess a collection of student work demonstrating their ability to write in different genres (e.g., news, features, opinion pieces) and across various platforms (e.g., print, online, social media). Situational judgement tests could present candidates with ethical dilemmas related to journalistic integrity, such as conflicts of interest or handling sensitive sources.

Limitations

It is important to acknowledge the study’s limitations. While the total population sampling ensured a comprehensive analysis within the specific institution, the generalisability of the findings is limited because the study focused on a single South African UoT. In addition, the research examined only correlations with English I and II performance, leaving broader aspects of journalism education unexamined.

While this study primarily focused on the relationship between selection methods and English module performance, it is acknowledged that factors such as student motivation, socio-economic background, and prior academic experiences may influence academic outcomes. Future research could explore these variables in greater depth to provide a more comprehensive understanding of student success in journalism programmes.

Future research could address these limitations by:

  • Expanding the scope to include multiple institutions, allowing for broader generalisability of the findings
  • Exploring correlations with other aspects of journalism education, including practical skills, critical thinking, and research abilities
  • Investigating the relationship between other potential selection criteria and student success in journalism.

By addressing these limitations and continuously refining selection methods, journalism programmes can improve their ability to identify and nurture students with the skills, knowledge, and ethical grounding necessary to succeed in the demanding and dynamic profession of journalism.

Conclusion

This study provides valuable insights into the effectiveness of the selection methods employed by the Department of Journalism at a prominent South African UoT in predicting students’ academic success in English modules. The weak correlations identified between Matric English grades, selection test results, and selection interview outcomes on the one hand, and the performance of journalism students in the English I and II modules on the other, underscore the limitations of the current selection criteria. These findings necessitate a critical re-evaluation and enhancement of the existing methods to ensure a more precise assessment of students’ potential within the field of journalism.

Looking ahead, it is imperative for the Department of Journalism to proactively refocus its selection process to align with the ever-evolving demands of the journalism profession. A strategic shift towards a more comprehensive and robust approach is recommended. This involves integrating alternative assessments, such as writing samples, portfolio reviews, and situational judgement tests, alongside traditional metrics. Such an approach would offer a holistic evaluation of candidates, capturing not only their language proficiency but also their critical thinking, creativity, and ethical judgement – attributes crucial for success in journalism.

Embracing this holistic perspective in journalism departments and within higher education institutions is crucial for ensuring that enrolled students are adequately equipped to thrive both academically and professionally in the dynamic field of journalism. The study advocates for a proactive adaptation to the changing landscape of journalism, acknowledging that a multifaceted selection process is pivotal in identifying individuals with the diverse skill sets demanded by the profession.

Furthermore, the study highlights the importance of ongoing research in this domain to refine and improve selection methods across various institutions. Collaborative efforts within the academic community can contribute to the development of standardised best practices for journalism education admissions, ensuring consistency and fairness while selecting candidates with the potential to become competent and skilled journalists. Continuous refinement and innovation in selection methods will play a vital role in preparing journalism students for the challenges and opportunities presented by the evolving media landscape.

Acknowledgements

The authors would like to acknowledge the time invested by the design team in contributing to the formulation of the design principles presented in this article.

This article is partially based on W.C. van Wyk’s dissertation entitled ‘Reliability of success predictors for English among journalism students at a university of technology’, submitted towards the degree of Master of Technology: Language Practice in the Department of Applied Languages, Tshwane University of Technology, South Africa, in August 2023, with supervisors G.W. Collins and M.M. Swanepoel.

Competing interests

The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this article.

Authors’ contributions

This article is extracted from W.C.v.W.’s dissertation, supervised by G.W.C. and M.M.S.

Funding information

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Data availability

The data that support the findings of this study are available from the corresponding author, G.W.C., upon reasonable request.

Disclaimer

The views and opinions expressed in this article are those of the authors and are the product of professional research. They do not necessarily reflect the official policy or position of any affiliated institution, funder, agency, or the publisher. The authors are responsible for this article’s results, findings, and content.

References

Akoglu, H., 2018, ‘User’s guide to correlation coefficients’, Turkish Journal of Emergency Medicine 18(3), 91–93. https://doi.org/10.1016/j.tjem.2018.08.001

Basilan, M.L.J.C. & Padilla, M.A., 2023, ‘Assessment of teaching English Language Skills: Input to digitized activities for campus journalism advisers’, International Multidisciplinary Research Journal 4(4), 118–130. https://doi.org/10.54476/ioer-imrj/245694

Bunch, G., Endris, A., Panayotova, D., Romero, M. & Llosa, L., 2011, Mapping the terrain: Language testing and placement for US-educated language minority students in California’s community colleges, Full Report, University of California, Santa Cruz.

Christian, M.S., Edwards, B.D. & Bradley, J.C., 2010, ‘Situational judgment tests: Constructs assessed and a meta-analysis of their criterion-related validities’, Personnel Psychology 63(1), 83–117. https://doi.org/10.1111/j.1744-6570.2009.01163.x

DeDominicis, J. & Zabolotney, B., 2020, Competency-based assessments: Understanding the use of competency-based assessments in admissions processes, BBCAT, Vancouver.

Fosu, M., 2011, ‘Situating language at the centre of journalism training’, Global Media Journal 5(1), 52–80. https://doi.org/10.5789/5-1-57

Krumm, S., Lievens, F., Hüffmeier, J., Lipnevich, A.A., Bendels, H. & Hertel, G., 2015, ‘How “situational” is judgment in situational judgment tests?’, Journal of Applied Psychology 100(2), 399–416. https://doi.org/10.1037/a0037674

Levy-Feldman, I. & Libman, Z., 2020, ‘The added value of a new interviewing tool for the selection of candidates for the teaching profession’, Journal of Applied Research in Higher Education 12(2), 330–343. https://doi.org/10.1108/JARHE-01-2019-0011

Madinga, N.W., Maziriri, E.T. & Lose, T., 2016, ‘A qualitative inquiry on the challenges facing international students at institutions of higher learning in Southern Gauteng, South Africa’, Towards Excellence in Educational Practices, Conference proceedings of the South Africa International Conference on Education, held at Manhattan Hotel, Pretoria, South Africa, September 19–21, 2016, pp. 60–71.

Maree, K. & Pietersen, J., 2016, ‘The quantitative research process’, in K. Maree (ed.), First steps in research 2, 2nd edn., Van Schaik, Pretoria.

Meier, C. & Hartell, C., 2009, ‘Handling cultural diversity in education in South Africa’, SA-eDUC Journal 6(2), 180–192.

Moreira, S.V. & Lago, C., 2017, ‘Journalism education in Brazil: Developments and neglected issues’, Journalism & Mass Communication Educator 72(3), 263–273. https://doi.org/10.1177/1077695817719609

Mulligan, D. & Kirkpatrick, A., 2000, ‘How much do they understand? Lectures, students and comprehension’, Higher Education Research & Development 19(3), 311–335. https://doi.org/10.1080/758484352

Muswede, T., 2016, ‘English as a second language offering in South African High Schools: Implications for quality education among media students’, Towards Excellence in Educational Practices, Conference proceedings of the South Africa International Conference on Education, held at Manhattan Hotel, Pretoria, South Africa, September 19–21, 2016, pp. 101–110.

Nkoala, S.B., 2020, ‘Student perceptions of multilingualism and the culture of communication in journalism studies in higher education’, Reading & Writing-Journal of the Reading Association of South Africa 11(1), 1–9. https://doi.org/10.4102/rw.v11i1.258

Ramburuth, P. & Tani, M., 2009, ‘The impact of culture on learning: Exploring student perceptions’, Multicultural Education & Technology Journal 3(3), 182–195. https://doi.org/10.1108/17504970910984862

Schober, P., Boer, C. & Schwarte, L., 2018, ‘Correlation coefficients: Appropriate use and interpretation’, Anesthesia & Analgesia 126(5), 1763–1768. https://doi.org/10.1213/ANE.0000000000002864

Silva, P.L., Nunes, L.C., Seabra, C., Reis, A.B. & Alves, M., 2020, ‘Student selection and performance in higher education: Admission exams vs. high school scores’, Education Economics 28(5), 437–454. https://doi.org/10.1080/09645292.2020.1782846

Taylor, S. & Von Fintel, M., 2016, ‘Estimating the impact of language of instruction in South African primary schools: A fixed effects approach’, Economics of Education Review 50, 75–89. https://doi.org/10.1016/j.econedurev.2016.01.003

Vukic, T., 2022, ‘On the threshold of a journalism education studies terminology’, Media Literacy and Academic Research 5(2), 110–131.

Whetzel, D.L., Sullivan, T.S. & McCloy, R.A., 2020, ‘Situational judgment tests: An overview of development practices and psychometric characteristics’, Personnel Assessment and Decisions 6(1), 1. https://doi.org/10.25035/pad.2020.01.001

Willis, L. & Martinez, M.R., 2023, ‘Authentic student work in college admissions: Lessons from the Ross School of Business’, Learning Policy Institute, Palo Alto, CA. https://doi.org/10.54300/756.774

Zwick, R., 2019, ‘Assessment in American higher education: The role of admissions tests’, The ANNALS of the American Academy of Political and Social Science 683(1), 130–148. https://doi.org/10.1177/0002716219843469


