I studied more than 100 hours a week to enter medical school. Nine years later, with more than $100,000 in student loans, a medical degree, and publications in eye surgery, I decided to leave clinical medicine and become a full-time educator.
Over the past several years, I’ve worked with thousands of medical students worldwide, observing a startling issue with how we train our future doctors.
Globally, employment rates among new medical graduates are among the highest of any profession, exceeding 90% in the first year.
However, pursuing a career as a doctor is one of the most challenging pathways. While it’s a societally prestigious profession, medical graduates worldwide report higher rates of stress, burnout, career dissatisfaction, and depression than other occupations. Not only do they suffer from mental health issues, they’re also more likely to make medical errors and have worse job performance.
These are the doctors you rely on when you or your loved ones need an operation. When I left clinical practice, I received dozens of messages asking “Why?”. Yet not a single doctor asked me that. Instead, they asked “How?”.
New Zealand, Australia and many other countries face an epidemic of burnt-out doctors, especially since the emergence of COVID-19. Insights into the employability trajectories of medical graduates, and the role of higher education (HE) in facilitating them, are therefore essential.
These insights can help relevant stakeholders develop better strategies to address the chronic problems facing struggling medical graduates.
So, how are medical students trained, and what do their employability trajectories look like?
Quantity over quality in medical training
Medicine equips students with an enormous amount of specialised knowledge and skills. Consequently, medical programs are longer than other bachelor degrees, and medical students tend to spend more time and energy on acquiring vocational knowledge and skills in comparison to other majors.
Employability researchers call this knowledge and these skills “human capital”. Theoretically, acquiring rich human capital should be a significant advantage, enabling graduates to perform their jobs effectively. Ironically, however, despite the heavy focus on specialised training, medical graduates still feel underprepared for clinical practice.
The following two areas have been documented as the main reasons for medical graduates’ under-preparedness for clinical practice.
First, medical programs at some universities are insufficient. The traditional curriculum is heavily theoretical. Only recently have medical programs started embedding “integrated curriculums”, which incorporate clinical application at very early stages so graduates can be better prepared for clinical work and more engaged in their studies.
However, due to unclear guidelines on what constitutes effective integration, the significant investment required for curriculum redesign, challenges with effective measurement of integrated curriculums, and general resistance to change, adoption of an integrated curriculum is still not the norm, with considerable variability in its implementation.
Second, unprofessional and ineffective training programs and practices at hospitals create significant hurdles for meaningful learning and practice. Since hospitals exist primarily as a public service for treating patients rather than for teaching new doctors, the systems and structures for facilitating education during these transition years have tended to be poor, with substandard learning environments and reports of unsatisfactory teaching quality and feedback.
Further, while supervision, mentorship and the learning environment play an essential role in developing competent medical doctors, bullying in hospital workplaces is rampant, with 30% to 95% of junior doctors being bullied, often by senior doctors.
This unhealthy training environment hinders medical graduates from learning from their mentors, and contributes to a high level of burnout and job dissatisfaction.
Doctors convert their wellbeing into yours
Medical programs pack in a high volume of academic content, practical assessments and internships, on top of the emotionally charged nature of working with patients and levels of workload and patient responsibility that increase every year.
There are limited opportunities for medical students to learn what’s beyond their specialisations, such as “psychological capital”, which refers to the skills and competencies to look after one’s wellbeing.
Medical students are constantly stressed by heavy study loads, yet have little knowledge and few skills to look after their mental health. Consequently, some studies report that about 6% of each cohort of medical students drops out, with 40% of those departures attributed to psychological morbidity and recurring themes of failure and despair.
Once they enter the profession, medical graduates worldwide report higher rates of burnout and career dissatisfaction than those in other professions.
Despite these well-documented problems and repeated calls to better-prepare medical students with effective mental strategies, current medical training still doesn’t embed programs to sufficiently support and enhance psychological capital, leading to medical graduates feeling disproportionately underprepared regarding the coping skills they need for life as a doctor.
Hospitals as employers: A non-negotiable career trajectory
Medicine has a combination of high demand for doctors, high demand for medical school acceptance, a tight bottleneck in which most candidates aren’t accepted into a medical program, and a unified employment entity that controls the spaces available and employment positions at the end of training.
As a result, in many countries, including the UK, Australia and New Zealand, public hospitals have a monopsony on medical graduates, with career progression being directly tied to employment in these hospitals. This means that if a medical graduate wants to work as a doctor or progress their career, they have no other option than to settle for what’s given.
This monopsony has been identified as a systemic enabler of substandard work conditions. For example, the response to declining hospital funding in some areas is simply to increase the patient load and hours for each doctor.
In addition, despite the enormous body of research showing adverse mental health outcomes in medical professionals, hospital management doesn’t typically make time available for doctors to access effective resiliency training. Much of the action taken by hospital leadership is reactive and meagre relative to the published magnitude of the problem.
Interventions are usually at the individual level; these have not only been shown to be more limited in their effects than group-level changes, but also face criticism for deflecting institutional responsibility. Unsurprisingly, a significant (sometimes dominant) source of job dissatisfaction is hospital management.
I recall several instances of this problematic culture. Requests for support from my colleagues were instead met with managers telling them to be more efficient. When I asked management why the rosters understated the department’s actual schedule by more than 10 hours a week, they replied they would have to pay us more if the rosters were accurate.
A friend who applied six months in advance for leave for their wedding was denied because the rosters hadn’t been released. When they reapplied, they were denied because there was no longer space on the roster – forcing them to terminate their contract and delay their training by three months in order to take five days off.
For a doctor who wishes to remain a doctor, this environment is the only option.
Should universities teach medical students for employment or employability?
Employability is a contested concept. From the institutional perspective, employability is often equated with short-term employment outcomes, and is believed to be achieved by acquiring human capital. This is reflected in how universities implement academically-heavy programs and often measure success with surveys of graduate employment rates four to 12 months after graduation.
However, increasing evidence shows these approaches fail to prepare students for real employability outcomes, which consist of at least four components – employment outcomes, job satisfaction, wellbeing, and sustainability. To achieve these outcomes, students need to be equipped with a range of resources, of which technical knowledge and specialisation is just one.
Current medical programs exemplify how universities mainly prepare students for employment outcomes. Students are trained to become specialists in a field where government policies control the balance between supply and demand.
As a result, medical graduates often have an extremely high employment rate, but, as discussed above, their employability outcomes are questionable.
Medical graduates often face wellbeing issues and have a high level of job dissatisfaction, but little room for alternative career options. They have virtually no opportunities to negotiate work environments, employer, training, and future career options.
Their lack of adequate preparation compounds with substandard work conditions to produce doctors who suffer from significant mental health challenges and burnout while relentlessly attempting to empathetically care for you and your loved ones.
Unlike most other fields, the way out of this environment is out of the profession.
These long-standing issues explain why, although we continue to inject thousands of new doctors into the workforce every year, problems with job performance, shortages, retention, and mental health remain unsolved.
It’s time for higher education and other stakeholders to take action. Current medical programs need to diversify their curriculums, at minimum by incorporating resources that help medical students look after their mental health, deal with pressures at work, and even navigate a career change – needs that are becoming even more critical in the post-COVID world.
Integrated curriculums also need to be consistently designed and delivered, as they’re crucial to enhancing clinical practice. Additionally, since hospitals are the primary market where doctors are employed, hospital leadership should work with unions and representatives to make hospitals more supportive, collegial, and productive for junior doctors.
At an individual level, medical students should proactively seek solutions to their institutions’ limitations (for example, independently integrating their learning with clinical contexts wherever possible, and leveraging student groups and senior colleagues).
Finally, because employability is determined by a range of factors outside of human capital – a strong argument recently made by employability researchers – graduates must build non-academic resources through extracurricular programs to have greater control over their trajectories.
I spent several years throughout medical school developing other professional skills that gave me numerous alternative opportunities. Similarly, my peers who thrive in clinical practice do so by leveraging a diverse range of employability skills to manage their work environments.
This article was originally published in Lens by Monash University