
Search Index

314 results found

  • You're not a fraud: battling imposter syndrome in STEM | Scientia News

    It's extremely pronounced in a technical environment
    Last updated: 22/05/25, 10:01 | Published: 17/04/25, 07:00

    Background
    If you work in STEM or take even a keen interest in the field, it's highly likely you'd have heard of, and possibly experienced, 'imposter syndrome'. Despite the glamorised success stories and carefully curated achievements we see in professional circles and on social media, let's take a realistic step back - people struggle no matter how qualified they are. It's okay to admit that, and it's time we remove the stigma of this common experience. Coming into the Scientist Training Programme, I felt a sense of excitement and pride in my achievement of having even managed to get a place on the programme. As I settled in, this quickly turned into something else – fear, anxiety, worry. Feelings that I may not be good enough or that I'm not where I belong. I seemed like the only one in my department without a postgraduate qualification. I began feeling out of place. It was only once I was able to put a label on this feeling – imposter syndrome – that I could take active steps to address it.

    So, what is imposter syndrome? Put simply, it's the persistent feeling of self-doubt and inadequacy despite evident success. It makes you question whether you truly deserve your accomplishments, fearing that at any moment, someone will expose you as a fraud. This is extremely pronounced in a technical environment where your success is largely measured by your ability to tackle complex problems.

    Understanding its purpose
    While frustrating, imposter syndrome stems from a mechanism designed to keep us grounded and striving for growth. As social beings, we evolved to be highly attuned to hierarchies and belonging, and self-doubt may have once served as a protective mechanism, preventing reckless decisions. However, in today's world, particularly in STEM fields, this innate caution can turn into chronic self-evaluation.

    The role of social media
    Imposter syndrome can be exacerbated through the often-unrealistic lens of social media. As I scroll through various social media platforms, I encounter countless posts showcasing often unrealistically flawless careers. Despite what you see in those 'day in the life' posts, not every STEM professional wakes up at 4am and has a cold shower. Rarely do we see the setbacks, rejections, or moments of self-doubt behind those polished posts, yet they exist for everyone. The distortion of what we see online is undoubtedly a catalyst for imposter syndrome, but we can take a sensible step back and look at things through a realistic lens. Comparison truly can be the thief of joy if you let it.

    Coping strategies
    The good news is, it's not all doom and gloom, and there are strategies we can employ to handle our mischievous minds. As STEM professionals, sometimes we become isolated in our work, deeply ingrained in fixing a problem and not realising there are countless others to share our thoughts and feelings with. This is something I pushed myself to do, and as I reached out to the wider community of trainee scientists, I quickly realised that I wasn't alone. Almost everyone I spoke to shared a similar sentiment of having experienced imposter syndrome to some extent. It is important to remember that imposter syndrome has been a universal experience for a very long time.
    It is certainly not a feeling that is exclusive to those in the early stages of their career, as I surprisingly found out having networked with senior figures in the STEM community. My supervisor – a consultant clinical scientist with over 40 years of experience – still experiences imposter syndrome as he tackles new challenges in the ever-evolving world of science. I have found that keeping a journal has been incredibly beneficial in logging my achievements – whether personal or career-related. Having a record of successes, no matter how small, serves as a tangible reminder that progress is being made, even when self-doubt tries to convince me otherwise. But the most effective tool I've discovered is something I'm still learning myself - self-compassion instead of self-criticism. It's easy to be too hard on yourself, especially in STEM, where learning new things daily is the norm. The pressure to always have the right answers can make mistakes feel like failures rather than part of the learning process. But the reality is that growth comes from pushing through discomfort, not from perfection. Learning to extend yourself the same kindness you would offer a friend can make a world of difference in battling imposter syndrome.

    Reframing its meaning
    If you have experienced imposter syndrome, I do have some good news for you – you're pushing yourself out of your comfort zone in some way and challenging yourself. That is something to be proud of, and it's important to realise that experiencing imposter syndrome can sometimes simply be a mandatory byproduct of self-growth. You are exactly where you need to be. Even the greatest of minds can experience imposter syndrome. Albert Einstein himself once remarked: "The exaggerated esteem in which my lifework is held makes me very ill at ease. I feel compelled to think of myself as an involuntary swindler." So, remember, you're not alone in this struggle.

    When to seek help
    While imposter syndrome is something that a large majority of people experience, you should know when to seek help. If it manifests into something much more than occasional self-doubt, there is no shame in reaching out for help. Speaking to trusted friends or family about how you're feeling is crucial to keep your mind in the right place. A qualified therapist will be well equipped to help you deal with imposter syndrome and keep you grounded. There is a wealth of online resources that can be used to help you, such as articles, self-help guides, and professional development communities – including the team here at Scientia News, who offer strategies to build confidence and reframe negative thinking. Acknowledging imposter syndrome is the first step, but learning to challenge it is what truly allows you to move forward. And the next time you begin to doubt yourself, take a step back and think about your achievements and how they themselves were born from the ashes of self-doubt.

    Written by Jaspreet Mann
    Related articles: My role as a clinical computer scientist / Mental health strategies / Mental health in South Asian communities

    REFERENCES
    'Imposter Syndrome: A Curse You Share with Einstein.' Thesislink, 10 July 2018. Available at: https://thesislink.aut.ac.nz/?p=6630.
    NHS Inform (2023) 'Imposter syndrome', NHS Inform. Available at: https://www.nhsinform.scot/healthy-living/mental-wellbeing/stress/imposter-syndrome.
    Mind (2022) 'Understanding imposter syndrome', Mind. Available at: https://www.mind.org.uk/information-support/types-of-mental-health-problems/imposter-syndrome/.
    Healthline (2021) 'What is imposter syndrome and how can you combat it?', Healthline. Available at: https://www.healthline.com/health/mental-health/imposter-syndrome.
    Psychology Today (2020) 'Overcoming imposter syndrome', Psychology Today. Available at: https://www.psychologytoday.com/gb/blog/think-well/202002/overcoming-imposter-syndrome.
    beanstalk (2023) 'Feel Like a Fraud? How to Overcome Imposter Syndrome', Employee and Family Resources, 1 January. Available at: https://efr.org/blog/feel-like-a-fraud.
    Ling, Ashley (2024) '3 Ways to Get Past Imposter Syndrome', Thir.St, 13 August. Available at: https://thirst.sg/3-ways-to-get-past-imposter-syndrome/.

  • Understanding and detecting Kawasaki disease on time | Scientia News

    A rare disease that causes inflammation in the blood vessels
    Last updated: 24/02/25, 11:31 | Published: 06/02/25, 08:00

    What is Kawasaki disease?
    Kawasaki disease is a rare type of vasculitis that damages blood vessels through inflammation and is prevalent in children under the age of five. Kawasaki disease is predominantly found in children of Asian descent – mainly in Japan, Korea, Taiwan, and among Asian populations in the US – and is the leading cause of acquired heart disease in children in most developed countries.

    What causes Kawasaki disease?
    There is no known cause of Kawasaki disease; however, studies suggest a link between genetics and the disease, noting a high incidence between siblings and in children with a parental history of Kawasaki disease. Another study provided further evidence of genetic susceptibility, stating that variation in the expression of CASP3 and ITPKC – genes heavily involved in T cell function – leads to an overexpression of T cells, which may account for the inflammatory symptoms of the disease. There are speculations that it may be caused by an airborne agent originating in Central Asia which moves across different geographical regions. This study suggests that, carried by winds, the airborne agent is able to cause Kawasaki disease via infection of the respiratory tract – further investigation is needed regarding this hypothesis.

    Diagnosing Kawasaki disease
    Symptoms of Kawasaki disease, which are often accompanied by a fever, are classified into three phases: acute, subacute, and convalescent. The acute phase usually lasts between two to three weeks and symptoms include:
    - Carditis
    - Mucosal inflammation (cracked and dry lips, strawberry tongue, swollen lymph nodes)
    - Polymorphous rash
    - Coronary artery aneurysms
    The subacute phase also lasts up to three weeks and includes symptoms such as:
    - Perineal and periungual desquamation
    - Arthralgia
    - Myocardial disease
    The convalescent phase is when most clinical signs dissolve and usually lasts up to three months. It is important to note that while most symptoms clear up during this phase, cardiac issues may still persist in some patients. Misdiagnosing Kawasaki disease is very common as its symptoms are similar to those of many diseases like scarlet fever or toxic shock syndrome. With that being said, confirming its diagnosis is often a case of ruling out these diseases. In addition to identifying symptoms linked to other diseases, conducting laboratory tests such as CRP, CBC, and ESR can help confirm a diagnosis of Kawasaki disease. Additionally, echocardiograms and electrocardiograms can help assess coronary abnormalities as well as overall heart function.

    Treating Kawasaki disease
    Following diagnosis, patients are first administered intravenous immunoglobulin (IVIG) and a high dose of aspirin to reduce inflammation as well as eliminate pain, swelling and fever. Patients are then administered lower doses of aspirin, which helps prevent blood clotting. Roughly 25% of untreated patients develop coronary artery aneurysms and lasting cardiovascular issues in general; this risk drops to 5% when patients are treated appropriately. IVIG is proven to be effective in treating approximately 85-90% of cases when administered within the first ten days of the illness, which is why it is imperative that patients are treated early.
    X-rays are regularly conducted on patients as they can help visualise blood vessels and potential heart abnormalities that may suggest further complications. They can also be used to monitor the effectiveness of treatment over time. Post-recovery, an echocardiogram is recommended periodically to detect any coronary abnormalities that may have developed much later on.

    Summary
    Kawasaki disease is a rare disease that causes inflammation in the blood vessels. It normally develops in children under the age of five and is yet to have a known cause. It is often hard to diagnose as its symptoms are similar to those of other diseases, which is why it is important to identify its symptoms (polymorphous rash, mucosal inflammation, desquamation, etc.) as well as conduct tests such as CBC, CRP, ESR, an electrocardiogram, etc. to help rule out other diseases. It is essential that children with Kawasaki disease are diagnosed and treated early, as this can help treat coronary artery aneurysms and prevent lasting coronary and cardiovascular abnormalities.

    Written by Sherine Latheef
    Related articles: Sideroblastic anaemia / Blood / Inflammation therapy

    REFERENCES
    Onouchi, Y., Ozaki, K., Burns, J.C., Shimizu, C., Hamada, H., Honda, T., Terai, M., Honda, A., Takeuchi, T., Shibuta, S., Suenaga, T., Suzuki, H., Higashi, K., Yasukawa, K., Suzuki, Y., Sasago, K., Kemmotsu, Y., Takatsuki, S., Saji, T. and Yoshikawa, T. (2010). Common variants in CASP3 confer susceptibility to Kawasaki disease. Human Molecular Genetics, 19(14), pp.2898–2906. doi: https://doi.org/10.1093/hmg/ddq176.
    Agarwal, S. and Agrawal, D.K. (2017). Kawasaki Disease: Etiopathogenesis and Novel Treatment Strategies. Expert Review of Clinical Immunology, [online] 13(3), pp.247–258. doi: https://doi.org/10.1080/1744666X.2017.1232165.
    Wolff, A.E., Hansen, K.E. and Zakowski, L. (2007). Acute Kawasaki Disease: Not Just for Kids. Journal of General Internal Medicine, [online] 22(5), pp.681–684. doi: https://doi.org/10.1007/s11606-006-0100-5.
    Oh, J.-H., Cho, S. and Choi, J.A. (2023). Clinical Signs of Kawasaki Disease from the Perspective of Epithelial-to-Mesenchymal Transition Recruiting Erythrocytes: A Literature Review. Reviews in Cardiovascular Medicine, 24(4), p.109. doi: https://doi.org/10.31083/j.rcm2404109.
    Health Jade Team (2018). Kawasaki Disease – Causes, Signs, Symptoms, Treatment. [online] Health Jade. Available at: https://healthjade.com/kawasaki-disease/.

  • How Gorongosa National Park went from conflict to community | Scientia News

    A restored wildlife reserve in Mozambique
    Last updated: 24/04/25, 11:42 | Published: 27/03/25, 08:00

    This is article no. 5 in a series on animal conservation. Next article: Emperor penguins, the kings of the ice. Previous article: Pangolins: from poached to protected.

    Gorongosa National Park was the centre of a dark time in Mozambique's history, which led to most mammals being hunted and entire species going locally extinct. Over the last 20 years, a public-private collaboration has restored many of these species and made Gorongosa National Park a healthy ecosystem again. In this article, I explore what nearly wiped out Gorongosa's mammals and how they are doing today.

    About Gorongosa National Park
    Gorongosa National Park is a wildlife reserve in Mozambique containing grasslands, savannahs, woodlands, and wetlands. It lies at the East African Rift's southern end, making for a complex geological landscape centred around Lake Urema (Figure 1). Lake Urema and the rivers draining into it support a high diversity of herbivorous mammals like elephants, zebras, and antelopes.

    Civil war and subsequent recovery
    Gorongosa National Park has illustrated the connections between society and ecology for decades. Civil war raged in Mozambique between 1977 and 1992. During this war, both sides hunted without restrictions in Gorongosa for meat and for valuable animal parts like ivory, which were exported to pay for ammunition. This decreased the population sizes of all animal species in the national park by at least 90%. Twelve years after the war ended, an American non-profit called the Gregory Carr Foundation partnered with the government of Mozambique to conserve and restore Gorongosa. The initiative, now called the Gorongosa Project, aims to bring back mammal species which went locally extinct in the war. In addition to providing healthcare, jobs, and education to 200,000 people living near the national park, the Gorongosa Project invests in tourism and ecological research. Ecologists are interested in how different animal species would rebound from the war and how a diverse ecosystem could be created nearly from scratch. By fostering healthy connections between local communities, scientists, and wildlife after the Mozambican Civil War, the Gorongosa Project has become something special.

    How different animal species recovered after the Mozambican Civil War
    Since mammalian herbivores were the cornerstone of pre-war Gorongosa National Park, their recovery has been prioritised. The populations of most herbivores have increased since the Civil War, but at varying rates. Waterbucks, a species of antelope, have dominated Gorongosa in the years following the war (Figure 2). This could be because more waterbuck survived the war in the first place and/or because they naturally reproduced faster than other mammals. Stalmans et al. found that waterbucks were growing as fast as they biologically could, as though they had infinite resources and no diseases or predators. Meanwhile, the populations of larger herbivores like hippos, buffaloes, and elephants, which used to dominate Gorongosa, are recovering much more slowly than waterbucks (Figure 2). With this change in the herbivore community came changes in vegetation.
    According to Daskin et al., the amount of land covered by trees in Gorongosa increased by 34% between 1977 and 2012 (Figure 3). This was because there were fewer elephants or other 'browsing' herbivores to clear out woody vegetation. Thus, the Mozambican Civil War altered the community structure of herbivorous mammals and plants in Gorongosa National Park. After herbivores showed signs of recovery, scientists turned to restoring carnivorous mammals. Lions were the only carnivores not to go locally extinct during the war, so they recovered fastest. Between 2012 and 2016, Bouley et al. counted 104 lions in Gorongosa – about half the pre-war count. Following the success of lions, wild dogs were introduced from two different South African populations in 2018 and 2019. Over the following three breeding seasons, 82 pups were born, and dogs originally from different populations naturally formed their own packs. Wild dogs and lions prefer different prey and hunt in different habitats within Gorongosa, allowing both carnivores to coexist. This successful restoration of mammalian carnivores completed Gorongosa National Park's post-war ecosystem.

    Conclusion
    After most mammals in Gorongosa National Park were hunted during a civil war, the Gorongosa Project restored a functioning ecosystem by diligently monitoring wildlife and working alongside local people. The park has brought attention to the often-neglected non-human impacts of war. Conservationists are optimistic that if Gorongosa National Park's ecosystem can recover from almost nothing, it is not too late to save other damaged ecosystems. Although Gorongosa's ecosystem today is dominated by waterbucks, time will tell whether populations of carnivores and larger herbivores will return to their former glory.

    Written by Simran Patel
    Related articles: Galapagos tortoises / Vicuna conservation

    REFERENCES
    Stalmans, M.E. et al. (2019) 'War-induced collapse and asymmetric recovery of large-mammal populations in Gorongosa National Park, Mozambique', PLOS ONE, 14(3), p. e0212864. Available at: https://doi.org/10.1371/journal.pone.0212864.
    Daskin, J.H., Stalmans, M. and Pringle, R.M. (2016) 'Ecological legacies of civil war: 35-year increase in savanna tree cover following wholesale large-mammal declines', Journal of Ecology, 104(1), pp. 79–89. Available at: https://doi.org/10.1111/1365-2745.12483.
    Bouley, P. et al. (2018) 'Post-war recovery of the African lion in response to large-scale ecosystem restoration', Biological Conservation, 227, pp. 233–242. Available at: https://doi.org/10.1016/j.biocon.2018.08.024.
    Bouley, P. et al. (2021) 'The successful reintroduction of African wild dogs (Lycaon pictus) to Gorongosa National Park, Mozambique', PLOS ONE, 16(4), p. e0249860. Available at: https://doi.org/10.1371/journal.pone.0249860.
    Gorongosa National Park (2020) Our Mission, Gorongosa National Park. Available at: https://gorongosa.org/our-mission-2/ (Accessed: 8 December 2024).
    Poole, J. et al. (2023) 'A culture of aggression: the Gorongosa elephants' enduring legacy of war', Pachyderm, 64, pp. 37–62. Available at: https://doi.org/10.69649/pachyderm.v64i.518.

  • The interaction between circadian rhythms and nutrition | Scientia News

    The effect of sleep on nutrition (nutrition timing)
    Last updated: 27/04/25, 11:20 | Published: 01/05/25, 07:00

    The circadian system regulates numerous biological processes with roughly a 24-hour cycle, helping the organism adapt to the day-night rhythm. Among others, circadian rhythms regulate metabolism, energy expenditure, and sleep, for which meal timing is a powerful cue. Evidence has shown that meal timing has a profound impact on health, gene expression, and lifespan. Properly timed feeding in accordance with the natural circadian rhythms of the body might improve metabolic health and reduce chronic disease risk.

    Circadian rhythms
    Circadian rhythms are controlled by the central clock of the brain, which coordinates biological functions with the light-dark cycle. Along with meal timing, circadian rhythms influence key elements of metabolism such as insulin sensitivity, fat storage, and glucose metabolism. When meal timing is not synchronised with the body's natural rhythm, it can cause circadian misalignment, disrupting metabolic processes and contributing to obesity, diabetes, and cardiovascular diseases. The literature indicates that it is best to eat during the daytime, particularly in synchrony with the body's active phase. Eating late at night or in the evening, when the body's circadian rhythm is directed towards sleep, could impair metabolic function and lead to weight gain, insulin resistance, and numerous other diseases. Also, having larger meals in the morning and smaller meals later in the evening has been linked to improved metabolic health, sleep quality, and even lifespan. A time-restricted eating window, in which individuals eat all meals within an approximately 10–12 hour window, holds promise for improving human health outcomes like glucose metabolism, inflammation, harmful gene expression, and weight loss (Figure 1). It is necessary to consider the impact of meal timing on gene expression. Our genes react to a number of stimuli, including environmental cues like food and light exposure. Gene expression of the body's metabolic, immune, and DNA repair processes is regulated by the body's circadian clock. Disturbances in meal timing influence the expression of these genes, which may result in greater susceptibility to diseases and reduced lifespan. Certain nutrients, such as melatonin in cherries and grapes, and magnesium in leafy greens and nuts, can improve sleep quality and circadian entrainment. Omega-3 fatty acids in fatty fish and flax seeds have also been shown to regulate circadian genes and improve metabolic functions.

    Other species
    Meal timing is quite varied among species, and animals have adapted such that food-seeking behaviour is entrained to circadian rhythm and environmental time cues. There are nocturnal animals which eat at night, when they are active (Figure 2). These nocturnal animals have evolved to align their meal time with their period of activity to maximise metabolic efficiency and lifespan. Meal timing is optimised in these animals for night activity and digestion. Humans, and most other animals, are diurnal and consume food during the day. In these animals, consuming most of their calories during the day is conducive to metabolic processes like glucose homeostasis and fat storage.
    These species tend to have better metabolic health when they are on a feeding regimen that is synchronised with the natural light-dark cycle.

    Conclusion
    Meal timing is important in human health, genetics, and life expectancy. Synchronising meal times with the body's circadian rhythms optimises metabolic function, reduces chronic disease incidence, and potentially increases longevity by reducing inflammatory genes and upregulating protective ones. This altered gene expression affects the way food is metabolised and how metabolic signals are acted upon by the body. Humans naturally gravitate towards eating during daytime hours, while other creatures have feeding habits that are adaptively suited to their own distinct environmental needs. It is important to consider this science and incorporate it into our schedules to receive the best outcome from an activity that we do not normally think about.

    Written by B. Esfandyare
    Related article: The chronotypes

    REFERENCES
    Meléndez-Fernández, O.H., Liu, J.A. and Nelson, R.J. (2023). Circadian Rhythms Disrupted by Light at Night and Mistimed Food Intake Alter Hormonal Rhythms and Metabolism. International Journal of Molecular Sciences, [online] 24(4), p.3392. doi: https://doi.org/10.3390/ijms24043392.
    Paoli, A., Tinsley, G., Bianco, A. and Moro, T. (2019). The Influence of Meal Frequency and Timing on Health in Humans: The Role of Fasting. Nutrients, [online] 11(4), p.719. Available at: https://www.ncbi.nlm.nih.gov/pubmed/30925707.
    Potter, G.D.M., Cade, J.E., Grant, P.J. and Hardie, L.J. (2016). Nutrition and the circadian system. British Journal of Nutrition, [online] 116(3), pp.434–442. doi: https://doi.org/10.1017/s0007114516002117.
    St-Onge, M.P., Ard, J., Baskin, M.L., et al. (2017). Meal timing and frequency: implications for obesity prevention. American Journal of Lifestyle Medicine, 11(1), pp.7-16.
    Patterson, R.E. and Sears, D.D. (2017). Metabolic effects of intermittent fasting. Annual Review of Nutrition, 37, pp.371-393.
    Zhdanova, I.V. and Wurtman, R.J. (2012). Melatonin treatment for age-related insomnia. Endocrine, 42(3), pp.1-12.
    Prabhat, A., Batra, T. and Kumar, V. (2020). Effects of timed food availability on reproduction and metabolism in zebra finches: Molecular insights into homeostatic adaptation to food-restriction in diurnal vertebrates. Hormones and Behavior, 125, p.104820.

  • Why South Asian genes remember famine | Scientia News

    Famine-induced epigenetic changes and public health strategies in affected populations
    Last updated: 22/05/25, 10:02 | Published: 23/01/25, 08:00

    Our genes are often thought of as a fixed blueprint, but what if our environment could change how they work? This is the intriguing idea behind epigenetics—a field that shows how our environment, combined with the body's adaptive responses for survival, can influence gene expression without altering our DNA. In South Asia, famines such as the infamous Bengal Famine of 1943 caused immense suffering, and these hardships may have triggered genetic changes that continue to affect generations. Today, South Asians face an increased risk of developing Type 2 diabetes by age 25, whereas White Europeans generally encounter this risk around age 40. What is driving this difference in risk? This article will explore the science behind these epigenetic changes, their impact on the descendants of famine survivors and how these insights can shape public health, policy, and research.

    The legacy of historical famines
    In 1943, the Bengal Famine claimed around three million lives. Nobel laureate Amartya Sen argues that the severity of the famine was not merely a result of prior natural disasters and disease outbreaks in crops. Instead, it was primarily driven by wartime inflation, speculative buying, and panic hoarding, which disrupted food distribution across the Bengal region. Consequently, for the average Bengali citizen, death from starvation, disease, and malnutrition became widespread and inevitable. The impact of the famine extended well beyond the immediate loss of life. Dr Mubin Syed, a radiologist specialising in vascular and obesity medicine, emphasises that these famines have left a lasting mark on the health of future generations. Dr Syed explains that South Asians, having endured numerous famines, have inherited "starvation-adapted" traits. These traits are characterised by increased fat storage. As a result, the risk of cardiovascular diseases, diabetes, and obesity is heightened in their descendants. This tendency towards fat storage is believed to be closely tied to epigenetic factors, which play a crucial role in how these traits are passed down through generations.

    Epigenetic mechanisms and their impact
    These inherited traits are shaped by complex epigenetic mechanisms, which regulate gene expression in response to environmental stressors like famines without altering the underlying DNA sequence. DNA methylation, a process involving the addition of small chemical groups to DNA, plays a crucial role in regulating gene expression. When a gene is 'on', it is actively transcribed into messenger RNA (mRNA), resulting in the synthesis of proteins such as enzymes that regulate energy metabolism or hormones like insulin that manage blood sugar levels. Conversely, when a gene is 'off', it is not transcribed, leading to a deficiency of these essential proteins. During periods of famine, increased DNA methylation can enhance the body's ability to conserve and store energy by altering the activity of metabolism-related genes. Epigenetic inheritance, a phenomenon where some epigenetic tags escape the usual reprogramming process and persist across generations, plays a crucial role in how famine-induced traits are passed down.
    Typically, reproductive cells undergo a reprogramming phase where most epigenetic tags are erased to reset the genetic blueprint. However, certain DNA methylation patterns can evade this erasure and remain attached to specific genes in the germ cells, the cells that develop into sperm and egg cells. These persistent modifications can influence gene expression in the next generation, affecting metabolic traits and responses to environmental stressors. This means the metabolic adaptations seen in famine survivors, such as increased fat storage and altered hormone levels, can be transmitted to their descendants, predisposing them to similar health risks. Research has highlighted how these inherited traits manifest in distinct hormone profiles across different ethnic groups. A study published in Diabetes Care found that South Asians had higher leptin levels (11.82 ng/mL) and lower adiponectin levels (9.35 µg/mL) compared to Europeans, whose leptin levels were 9.21 ng/mL and adiponectin levels were 12.96 µg/mL. Leptin, encoded by the LEP gene, is a hormone that reduces appetite and encourages fat storage. Adiponectin, encoded by the ADIPOQ gene, improves insulin sensitivity and supports fat metabolism. Epigenetic changes, such as DNA methylation in the LEP and ADIPOQ genes, have led to these imbalances, which were advantageous for South Asian populations during times of famine. Elevated leptin levels helped ensure the body could maintain energy reserves for survival, while lower adiponectin levels slowed fat breakdown, preserving stored fat for future use. This energy-conservation mechanism allowed individuals to endure long periods of food scarcity. Remarkably, these epigenetic changes can be passed down to subsequent generations. As a result, descendants continue to exhibit these metabolic traits, even in the absence of famine conditions. This inherited imbalance—higher leptin levels and lower adiponectin—leads to a higher predisposition to metabolic disorders. Increased leptin levels can cause leptin resistance, where the body no longer responds properly to leptin's signals, driving overeating and fat accumulation. Simultaneously, reduced adiponectin weakens the body's ability to regulate insulin and break down fats efficiently, resulting in higher blood sugar levels and greater fat storage. These combined effects heighten the risk of obesity and Type 2 diabetes in South Asian populations today.

    Integrating cultural awareness in health strategies
    Understanding famine-induced epigenetic changes provides a compelling case for rethinking public health strategies in affected populations. While current medicine cannot reverse famine-induced epigenetic changes in South Asians, culturally tailored interventions and preventive measures are crucial to reducing metabolic risks. These should include personalised dietary plans, preventive screenings, and targeted healthcare programmes. For example, the Indian Diabetes Prevention Programme showed that lifestyle changes reduced diabetes risk by 28.5% among high-risk individuals. Equally, policymakers must consider the broader societal factors that contribute to these health risks, and qualitative studies highlight challenges in shifting cultural attitudes. Expectations that women prepare meals in line with traditional norms often limit healthier dietary options. Differing perceptions of physical activity can complicate efforts to promote healthier lifestyles.
    For example, a study in East London found that some communities consider prayer sufficient exercise, which adds complexity to changing attitudes.

    Facing our past to secure a healthier future
    As we uncover the long-term effects of environmental stressors like historical famines, it becomes clear that our past is not just a distant memory but an active force shaping our present and future health. Epigenetic changes inherited from South Asian ancestors who endured famine have heightened the risk of metabolic disorders in their descendants. For instance, UK South Asian men have been found to have nearly double the risk of coronary heart disease (CHD) compared to White Europeans. Consultant cardiologist Dr Sonya Babu-Narayan has stated, "Coronary heart disease is the world's biggest killer and the most common cause of premature death in the UK." With over 5 million South Asians in the UK alone, this stark reality requires immediate action. We must not only address the glaring gaps in scientific research but also develop targeted public health policies to tackle these inherited health risks. These traits are not relics of the past; they are living legacies that, without swift intervention, will continue to affect generations to come. To truly address the inherited health risks South Asians face, we must go beyond surface-level awareness and commit to long-term, systemic change. Increasing funding for research that directly focuses on the unique health challenges within this population is non-negotiable. Equally crucial are culturally tailored public health initiatives that resonate with the affected communities, alongside comprehensive education programmes that empower individuals to take control of their health. These steps are not just about improving outcomes—they're about breaking a cycle. The question, therefore, is not simply whether we understand these epigenetic changes, but whether we have the resolve to confront their full implications. Can we muster the political will needed to confront these inherited risks? Can we unite our efforts to stop these risks from affecting the health of entire communities? The cost of inaction is not just measured in statistics—it will be felt in the lives lost and the potential unrealised. The time to act is now.

    Written by Naziba Sheikh
    Related articles: Epigenetics / Food deserts and malnutrition / Mental health in South Asian communities

    REFERENCES
    Safi, M. (2019). Churchill's policies contributed to 1943 Bengal famine – study. [online] The Guardian. Available at: https://www.theguardian.com/world/2019/mar/29/winston-churchill-policies-contributed-to-1943-bengal-famine-study.
    Bakar, F. (2022). How History Still Weighs Heavy on South Asian Bodies Today. [online] HuffPost UK. Available at: https://www.huffingtonpost.co.uk/entry/south-asian-health-colonial-history_uk_620e74fee4b055057aac0e9f.
    Sayed, M., Deek, F. and Shaikh, A. (2022). The Susceptibility of South Asians to Cardiometabolic Disease as a Result of Starvation Adaptation Exacerbated During the Colonial Famines. [online] ResearchGate. Available at: https://www.researchgate.net/publication/366596806_The_Susceptibility_of_South_Asians_to_Cardiometabolic_Disease_as_a_Result_of_Starvation_Adaptation_Exacerbated_During_the_Colonial_Famines.
    Utah.edu (2009). Epigenetics & Inheritance. [online] Available at: https://learn.genetics.utah.edu/content/epigenetics/inheritance/.
    Palaniappan, L., Garg, A., Enas, E., Lewis, H., Bari, S., Gulati, M., Flores, C., Mathur, A., Molina, C., Narula, J., Rahman, S., Leng, J. and Gany, F. (2018). South Asian Cardiovascular Disease & Cancer Risk: Genetics & Pathophysiology. Journal of Community Health, 43(6), pp.1100–1114. doi: https://doi.org/10.1007/s10900-018-0527-8.
    Diabetes UK (2022). Risk of Type 2 Diabetes in the South Asian Community. [online] Diabetes UK. Available at: https://www.diabetes.org.uk/node/12895.
    King, M. (2024). South Asian Heritage Month: A Journey Through History and Culture. [online] Wearehomesforstudents.com. Available at: https://wearehomesforstudents.com/blog/south-asian-heritage-month-a-journey-through-history-and-culture.

  • Hypertension: a silent threat to global health | Scientia News

    Causes, symptoms, diagnosis and management
    Last updated: 13/03/25, 11:38 | Published: 13/03/25, 08:00

    Introduction
    Did you know that hypertension, also known as high blood pressure, is a leading cause of premature death, affecting 1.28 billion adults aged 30-79 worldwide? According to the World Health Organisation (WHO), two-thirds of these individuals live in low- and middle-income countries. Despite its widespread prevalence, many people remain undiagnosed, as most cases are asymptomatic and individuals are unaware they have the condition. Hypertension can lead to serious clinical manifestations such as heart disease. It can also cause retinopathy, leading to vision problems, and kidney damage, including proteinuria. It further contributes to vascular complications like atherosclerosis, leading to stenosis and aneurysms, and significantly raises the risk of stroke and heart failure (Figure 1). Addressing hypertension through early diagnosis, improved access to treatment and lifestyle changes is essential to reducing its global burden. This article aims to explore the causes, diagnosis and treatments of hypertension.

    What drives hypertension?
    Hypertension is characterised by persistently elevated BP in the systemic arteries. Blood pressure is typically presented as a ratio: systolic BP, which measures the pressure on arterial walls during heart contraction, and diastolic BP, which reflects the pressure when the heart is at rest. Hypertension is diagnosed when the systolic blood pressure is 130 mmHg or higher and/or the diastolic blood pressure exceeds 80 mmHg, based on multiple readings taken over time (Figure 2). Most cases are primary (essential) hypertension, which has no single identifiable cause. In contrast, secondary hypertension occurs in only 5% of cases and is caused by an underlying condition, such as kidney disease, hormonal imbalances, or vascular problems. This form of hypertension is often reversible if the underlying cause is treated. Common causes of secondary hypertension include chronic kidney disease, polycystic kidney disease, hormone excess (such as aldosterone and cortisol), vascular issues like renovascular stenosis and certain medications. Drugs that can cause secondary hypertension include chronic use of non-steroidal anti-inflammatory drugs (NSAIDs), antidepressants and oral contraceptives. Hypertension, regardless of its cause, can be exacerbated by certain health behaviours, including excessive dietary salt, a sedentary lifestyle, heavy alcohol consumption, and diets low in essential nutrients, such as potassium. These factors contribute to the development and worsening of high blood pressure. However, blood pressure can be improved by reversing these behaviours, as well as following a diet rich in fruits and vegetables, which helps to mitigate the negative impact on blood pressure.

    Spotting hypertension: how it is diagnosed
    Hypertension is usually detected when blood pressure (BP) is measured during regular checkups. Since it often doesn't show symptoms, all adults should check their BP regularly. The most common way to diagnose hypertension is by measuring BP several times in a doctor's office. To get an accurate reading, BP must be measured carefully. Since BP can vary throughout the day, multiple measurements are needed. Doctors have recently started using home BP monitoring (HBPM) and ambulatory BP monitoring (ABPM) to check BP outside of the office.
    ABPM records BP every 20-30 minutes over 24 hours, while HBPM lets patients measure BP at home. These methods help identify conditions like 'white coat hypertension' (high BP in the doctor's office but normal at home) or 'masked hypertension' (normal BP at the doctor's office but high at home). When diagnosing hypertension, doctors also look for other health issues related to high BP, such as heart disease or kidney problems. If high BP is sudden or difficult to control, doctors may suspect secondary hypertension, which is caused by another condition, like kidney disease or hormonal imbalances. A thorough medical history is essential. This includes asking about past BP readings, medications, and lifestyle factors such as smoking and diet. Doctors also check for other risk factors, like diabetes or high cholesterol, which increase heart disease risk. A physical exam helps confirm the diagnosis of hypertension and checks for any damage to organs like the heart and kidneys. BP should be measured on both arms, and if there is a significant difference in readings, further tests may be needed. If necessary, doctors may also check for conditions like atrial fibrillation or perform ultrasounds to look for heart or kidney problems. Blood tests can also help identify risk factors, confirm or rule out secondary hypertension, and assess overall heart health.

    Managing hypertension, from lifestyle changes to medications
    Studies show that weight loss can reduce systolic blood pressure by 5 to 20 mmHg, making it an effective strategy for managing hypertension. However, the exact "ideal" body weight or Body Mass Index (BMI) for controlling blood pressure is not clearly defined, but small weight reductions can make a difference. Reducing salt intake, staying active, and managing sleep apnoea also help. While smoking does not directly raise blood pressure, quitting reduces long-term heart risks. Overall, lifestyle changes alone can cut cardiovascular events by up to 15%. Most national and international guidelines recommend the use of angiotensin-converting enzyme inhibitors (ACE inhibitors), angiotensin II receptor blockers (ARBs), calcium channel blockers (CCBs), and thiazide or thiazide-like diuretics as first-line pharmacological treatments for hypertension.

    Conclusion
    Hypertension is a prevalent and often silent condition with serious health consequences, including heart disease, stroke, and kidney failure. Its widespread impact on global health, particularly in low- and middle-income countries, underscores the importance of early diagnosis and proactive management. While lifestyle modifications are crucial in managing blood pressure, medications remain essential for many individuals. By raising awareness, promoting regular blood pressure checks, and ensuring access to both preventative and therapeutic measures, we can reduce the burden of hypertension and improve long-term health outcomes globally.

    Written by Michelle Amoah
    Related article: Cardiac regeneration

    REFERENCES
    Iqbal, A.M. and Jamal, S.F. (2023). Essential hypertension. In StatPearls [Internet]. StatPearls Publishing. Retrieved from https://www.ncbi.nlm.nih.gov/books/NBK539859/.
    Schmieder, R.E. (2010). End Organ Damage in Hypertension. Deutsches Ärzteblatt International. doi: https://doi.org/10.3238/arztebl.2010.0866.
    Touyz, R.M., Camargo, L.L., Rios, F.J., Alves-Lopes, R., Neves, K.B., Eluwole, O., Maseko, M.J., Lucas-Herald, A., Blaikie, Z., Montezano, A.C. and Feldman, R.D. (2022). Arterial Hypertension.
    In Comprehensive Pharmacology (pp. 469–487). Elsevier.
    World Health Organization (2023). Hypertension. Retrieved 24 January 2025, from https://www.who.int/news-room/fact-sheets/detail/hypertension.

  • A love letter from outer space: Lonar Lake, India | Scientia News

    The lunar terrain
    Last updated: 10/04/25, 10:54 | Published: 10/04/25, 07:00

    Around 50,000 years ago, outer space gifted the Earth with a crater that formed the foundations of the world's third largest natural saltwater lake, situated within a flat volcanic area known as the Deccan Plateau. This resulted from a 2 million tonne meteorite tunnelling through the Earth's atmosphere at a velocity of 90,000 km/hour and colliding with the Deccan Plateau. As time slipped away, pressure and heat melted the basalt rock tucked underneath the impact, and the accumulation of rainwater filled the crater with water. These foundations created what is famously known today as Lonar Lake. What is unique about Lonar Lake is that it is the only meteorite crater formed in basaltic terrain - analogous to a lunar terrain. Additionally, the remnants bear similarities to the terrestrial composition of Mercury, which contains craters, basaltic rock and smooth plains resulting from volcanic activity.

    Several investigations have sought to test the theory that the crater formed from the impact of a meteorite. One such collaborative study, conducted by the Smithsonian Institution of Washington D.C., USA, the Geological Survey of India and the US Geological Survey, involved drilling holes at the bottom of the crater and scrutinising the compositions of rock samples sourced from the drilling. When tested in the laboratory, it was found that the rock samples contained remnants of the basaltic rock that had been modified by the crater collision under high heat and pressure. In addition, shatter cones - cone-shaped fractures caused by high-velocity shock waves being transmitted into the rocks - were identified. These two observations align with the meteorite impact phenomenon.

    Along with its fascinating astronomical properties, scientists have been intrigued by the chemical composition of the lake within the crater. Its dark green colour results from the presence of the blue-green algae Spirulina. The water also has a pH of 10, making it alkaline in nature and supporting the development of its aquatic ecosystems. One explanation for the alkalinity of the water is that it is a result of immediate sulphide formation, where groundwater of meteorite origin containing CO2 undergoes a precipitation reaction with alkaline ions, leaving a carbonate precipitate with an alkaline nature. What is also striking about the composition of the water is its saline nature, which coexists with the alkaline environment - a rare phenomenon in ecological sciences.

    The conception of the lake, from the matrimony of Earth with the debris of outer space, has left its imprints on the physical world. It's a love letter, written in basaltic stone and saline water, fostering innovation in ecology. The inscription of the meteorite's journey within the crater has bridged two opposing worlds: one originating millions of miles away, and one that resides in the natural grounds of our souls.

    Written by Shiksha Teeluck
    Related articles: Are aliens on Earth? / JWST

    REFERENCES
    Taiwade, V.S. (1995). A study of Lonar lake—a meteorite-impact crater in basalt rock. Bulletin of the Astronomical Society of India, 23, 105–111.
    Tambekar, D.H., Pawar, A.L. and Dudhane, M.N. (2010). Lonar Lake water: Past and present. Nature Environment and Pollution Technology, 9(2), 217–221.

  • Building Physics | Scientia News

    Implementing established physical theories into the constructions of the future
    Last updated: 03/04/25, 10:39 | Published: 20/02/25, 08:00

    From the high-rise establishments that paint the expansive London skyline to the new-build properties nestled within thriving communities, buildings serve as a beacon of societal needs. The planned and precise architecture of buildings provides shelter and comfort for individuals, as well as meeting business agendas to promote modern-day living. Additionally, buildings serve a purpose as a form of protection: according to the World Health Organisation (WHO), buildings should be designed and constructed to create an environment suitable for human living, more favourable than the state of the natural environment outdoors. Construction and building protect us from:
    - extremes of temperature
    - moisture
    - excessive noise
    To sustain these pivotal agendas, a comprehensive analysis of the physical factors within the environment of buildings, including temperature, light and sound, is required for the design and legislation that allow a building to function. The field of 'Building Physics' primarily addresses these physical factors to innovate 'multifunctional solutions', be more efficient, and build upon present designs, which can be adapted for future use. Moreover, the built environment is regarded as one of the biggest sources of carbon emissions on the planet, so using building physics as an early design intervention can reduce energy consumption and minimise carbon emissions. This supports global manifestos of moving towards net zero and decreasing the likelihood of the detrimental effects caused by climate change.

    The main components of Building Physics
    Building Physics examines the functions of an interior physical environment, including air quality, thermal comfort, acoustic comfort (sound), and light:

    Air quality: Ventilation is needed for maintaining a safe environment and reducing the quantity of stale air - consisting of carbon dioxide and other impurities - within an interior environment. Air infiltration also contributes to significant heat loss, so it is important to provide intentional ventilation to increase the efficiency of energy transfers within the building. Thus, good 'airtightness' of a building fabric, which can be considered as the building's resistance to unintentional air infiltration or exfiltration, can enable planned airflows for ventilation.

    Thermal: The biggest influence within the field of Building Physics stems from an understanding of heat conductivity, which depends on the density and moisture content of the material, as well as heat transfers - conduction, convection, radiation and transition - to determine the suitability of materials used for construction (a simple worked sketch of conductive heat loss follows this list). For example, a material such as a solid wood panel for walls and ceilings is favourable as it can be installed in layers, providing even temperature fields across the surface. It is important that a building has the ability to isolate its environment from external temperature conditions and have the correct building envelope - a barrier that separates the interior and exterior of a building.
    Acoustics: A regulated control of sound within buildings contributes towards maintaining habitable conditions for building users, making sure that sound is clearly audible, undistorted, and that disturbances are reduced. Acoustics can be controlled and modified through material choices, such as installing sound-absorbing material. These materials can be adapted to reduce sound leakage, which is common at air openings, such as ventilators and doors, that are more likely to transmit sound than adjacent thicker walls.

    Light: Light allows an environment to be viewed in an attractive manner, with daylight in particular serving as a primary means of enhancing the exterior of a building whilst also functioning within it. One strategy used to fulfil the purpose of light in buildings is designing windows for the distribution of daylight to a space. The window design has a decisive effect on the potential daylight and thermal performance of adjacent spaces, so it needs to be closely checked using standardised methods in order to be suitable for use. Additionally, as windows are exposed to the sky, daylighting systems can adapt windows to transmit or reflect daylight as a function of incident angle, for solar shading, protection from glare and redirection of daylight.
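    Returning to the thermal component above, steady-state conduction through a single wall layer follows Q = k·A·ΔT/d, where k is the material's thermal conductivity. The short Python sketch below is an illustration, not part of the original article; the conductivity values are rough textbook figures and are assumptions, but they show how material choice changes heat loss for the same wall.

    # Rough sketch of one-dimensional steady-state conduction through a wall layer.
    # Q = k * A * dT / d, with k in W/(m*K), A in m^2, dT in K, d in m.
    def conduction_heat_loss(k: float, area_m2: float, delta_t: float, thickness_m: float) -> float:
        """Heat flow (watts) through a single homogeneous layer."""
        return k * area_m2 * delta_t / thickness_m

    # Assumed, approximate conductivities (they vary with density and moisture content):
    materials = {"solid timber panel": 0.13, "dense concrete": 1.7}  # W/(m*K)

    for name, k in materials.items():
        q = conduction_heat_loss(k, area_m2=10.0, delta_t=20.0, thickness_m=0.1)
        # Timber: ~260 W; concrete: ~3400 W for a 10 m^2 wall, 100 mm thick, 20 K difference
        print(f"{name}: ~{q:.0f} W lost through a 10 m2 wall, 100 mm thick, 20 K difference")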
    Overall, a key objective of sustaining a safe and eco-friendly building is to ensure that the space has proper heat and humidity, aligned with a suitable degree of acoustic and visual comfort, in order to sustain the health of the people using the building. Particularly within modern society, a combination of Building Physics principles and digitalised software, such as Building Information Modelling (BIM), can enhance the design process of a building to provide healthy environments for generations to come.

    Written by Shiksha Teeluck
    Related article: Titan Submersible

    REFERENCES
    Unsplash. A construction site with cranes [Internet]. [Accessed 2 January 2025]. Available from: https://unsplash.com/photos/a-construction-site-with-cranes-mOA2DAtcd1w.
    Katunský, D. and Zozulák, M. Building Physics. 2012. ISBN: 978-80-553-1261-3.
    Partel. Building Physics [Internet]. [Accessed 2 January 2025]. Available from: https://www.partel.co.uk/resources/building-physics/.
    RPS Group. A day in the life of a senior building physics engineer [Internet]. [Accessed 4 January 2025]. Available from: https://www.rpsgroup.com/insights/consulting-uki/a-day-in-the-life-of-a-senior-building-physics-engineer/.
    Cyprus International University. What is Building Physics and Building Physics Problems in General Terms [Internet]. [Accessed 6 January 2025].
    Centre for Alternative Technology. Airtightness and Ventilation [Internet]. [Accessed 6 January 2025]. Available from: https://cat.org.uk/info-resources/free-information-service/eco-renovation/airtightness-and-ventilation/.
    KLH. Building Physics [Internet]. [Accessed 6 January 2025]. Available from: https://www.klh.at/wp-content/uploads/2019/10/klh-building-physics-en.pdf.
    Watson, J.L. Climate and Building Physics [Internet]. [Accessed 6 January 2025]. Available from: https://calteches.library.caltech.edu/98/1/Watson.pdf.
    Ruck, N., Aschehoug, Ø., Aydinli, S., Christoffersen, J., Edmonds, I., Jakobiak, R., et al. Daylight in Buildings - A source book on daylighting systems and components. June 2000.
    Synergy Positioning Systems. How BIM Saves Time & Money for Construction Businesses [Internet]. [Accessed 6 January 2025]. Available from: https://groupsynergy.com/synergy-positioning-news/how-bim-saves-time-money-for-construction-businesses.

  • Sleep less…remember less: the hidden link between sleep and memory loss | Scientia News

    Not getting enough sleep can increase the risk of developing Alzheimer's
    Last updated: 07/04/25, 11:44 | Published: 17/04/25, 07:00

    People often don't get enough sleep for a variety of reasons, ranging from intentional choices like work or study demands (because who needs sleep when you've got deadlines, right?), to the growing concern with screen time (a.k.a. the "I'll just watch one more episode" syndrome), and of course, procrastination (where your brain convinces you that 3 a.m. is a great time to suddenly get productive). But it's not all fun and games—serious issues like insomnia, sleep apnoea, family responsibilities, or even shift work can also interfere with rest. Sleep disorders are increasingly common, with around one in three people in the UK affected, and they're particularly prevalent among the elderly. However, not getting enough sleep can increase the risk of developing Alzheimer's disease (AD).

    How do sleep disorders impact Alzheimer's disease?
    Insomnia is characterised by difficulty falling asleep or staying asleep, which can lead to prolonged fatigue and memory issues. As shown in Figure 1, people with insomnia tend to share some markers with those with Alzheimer's disease, such as increased levels of Aβ and tau proteins in the brain. This is primarily because a lack of sleep prevents the effective removal of harmful products from the brain – this accumulation increases a person's risk of AD. A plethora of experimental studies on humans and animals have shown that lack of sleep can lead to increased circulating levels of TNF-α and increased expression of the gene responsible for TNF-α secretion. This pro-inflammatory cytokine exacerbates AD pathology because neuroinflammation can lead to dysfunction and cell death, which are key markers of AD. Other pro-inflammatory cytokines, like IL-1, have been found to be relevant in the link between sleep deprivation and AD. Overexpression of IL-1 in the brain leads to abnormal changes in nerve cell structures, especially relating to Aβ plaques. This highlights IL-1's key role in plaque evolution and the synthesis of amyloid precursor protein, which promotes amyloid production that eventually results in AD pathology.

    What type of sleep can impact one's risk of Alzheimer's disease?
    Studies using more objective measures, like actigraphy (which tracks sleep-wake activity), found that sleep quality (sleep efficiency) is more important than total sleep time. For example, women with less than 70% sleep efficiency were more likely to experience cognitive impairment. Increased wakefulness during the night also moderated the relationship between amyloid deposition (a hallmark of AD) and memory decline.

    Uncertainties…
    However, it remains unclear whether poor sleep directly causes AD or if the disease itself leads to sleep disturbances. Some studies suggest a bidirectional relationship. Aging itself leads to poorer sleep quality, including reduced sleep efficiency, less slow-wave sleep (SWS), and more frequent awakenings. Sleep disorders like obstructive sleep apnoea, insomnia, and restless legs syndrome also become more common with age.

    What are the next steps?
    The good news is that many sleep disorders, including insomnia, are manageable, and improving sleep quality could be a simple yet powerful way to reduce Alzheimer's risk.
Additionally, early diagnosis and treatment of conditions like sleep apnoea and insomnia may help slow or even prevent neurodegenerative changes. As researchers continue to explore the intricate relationship between sleep and Alzheimer’s, one thing is clear: getting a good night’s sleep isn’t just about feeling refreshed. It is a crucial investment in long-term brain health.

Written by Blessing Amo-Konadu

Related articles: Overview of Alzheimer's / Hallmarks of Alzheimer's / CRISPR-Cas9 in AD treatment / Memory erasure

REFERENCES
Lucey, B. (2020). It’s complicated: The relationship between sleep and Alzheimer’s disease in humans. Neurobiology of Disease, [online] 144, p.105031. doi: https://doi.org/10.1016/j.nbd.2020.105031.
NHS (2023). Insomnia. [online] www.nhsinform.scot. Available at: https://www.nhsinform.scot/illnesses-and-conditions/mental-health/insomnia/.
Pelc, C. (2023). Not getting enough deep sleep may increase the risk of developing dementia. [online] Medicalnewstoday.com. Available at: https://www.medicalnewstoday.com/articles/not-getting-enough-deep-sleep-may-increase-dementia-risk#Clarifying-the-link-between-sleep-aging-and-dementia-risk [Accessed 22 Dec. 2024].
Sadeghmousavi, S., Eskian, M., Rahmani, F. and Rezaei, N. (2020). The effect of insomnia on development of Alzheimer’s disease. Journal of Neuroinflammation, 17(1). doi: https://doi.org/10.1186/s12974-020-01960-9.

  • The fundamental engineering flaws of the Titan Submersible | Scientia News

    The fundamental engineering flaws of the Titan Submersible
Last updated: 03/04/25, 10:27 Published: 03/04/25, 07:00
From the hull to the glass viewport - shortcuts in design

On June 18, 2023, the Titan submersible made headlines when the expedition to visit the wreck of the Titanic ended in tragedy. In the North Atlantic Ocean, 3,346 metres below sea level, the underwater vessel catastrophically imploded, killing all five passengers on board. Two years on, this article takes a deep dive into the key points of engineering failure and reflects on what we can learn from the fatal incident.

The Titanic and OceanGate’s mission
The Titanic wreck lies around 3,800 metres below sea level in the North Atlantic Ocean, approximately 370 miles off the coast of Newfoundland, Canada. Since the wreckage was finally discovered in September 1985, over seven decades after the ship sank following an iceberg collision on the 15th of April 1912, fewer than 250 people have viewed the wreck in person. Despite many discussions about raising the wreckage back to the surface, the complete Titanic structure has become too fragile after more than a century underwater and will likely disintegrate completely over the next few decades. Hence, viewing the Titanic in person is only possible with an underwater vessel, a feat achieved successfully since 1998 by a range of companies carrying historians, oceanographers, and paying tourists. The Titan submersible, developed by OceanGate Expeditions, is one such vessel. Titan had been attempting dives to the Titanic wreck since 2017 and was first successful in 2021, going on to complete 13 successful dives. According to the passenger liability waiver, however, this was only 13 out of 90 attempted dives (a 14% success rate), the rest failing as a result of communication signal failures, structural concerns, strong currents, poor visibility, or logistical issues. On the many failed attempts, the mission was either cancelled or aborted before the Titan reached the depth of the Titanic wreck. Despite concerns raised by engineers, poor success rates in testing and simulation, as well as previous instances of the Titan spiralling out of control, OceanGate continued with its first planned dive of 2023, leading to the catastrophic implosion that claimed five lives. The Titan disaster marks the first fatalities on a submersible dive to the Titanic.

What went wrong: structural design
When designing an underwater vessel to reach a certain depth, the body of the vessel, called the hull, must be capable of withstanding an immense amount of pressure. For every 10 metres of depth, the pressure on the submersible’s hull increases by roughly one atmosphere (about 1 bar, or 101 kPa). To reach the wreck of the Titanic 3,800 metres underwater would require the hull to withstand a pressure of over 38 MPa (see Figure 1). For perspective, this is around 380 times the pressure we feel at the surface and about 200 times the pressure of a standard car tyre. Over one square inch, this equates to the weight of nearly 2,500 kg. To withstand such high hydrostatic pressure, a submersible hull is normally constructed from high-strength steel and titanium alloys in a simple spherical, elliptical, or cylindrical shell. At this point we discover some of the key points of failure in the Titan. The Titan’s hull was made from Carbon Fibre Reinforced Plastic (CFRP), i.e., multiple layers of carbon fibre mixed with polymers.
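Before turning to the hull material in detail, the pressure figures quoted above can be sanity-checked with the standard hydrostatic relation P = ρgh. The short Python sketch below uses a typical seawater density; it is illustrative arithmetic only, not a design calculation from OceanGate or from the investigation reports.

# Hydrostatic pressure at Titanic depth: P = rho * g * h (illustrative arithmetic only)
RHO_SEAWATER = 1025   # kg/m^3, typical seawater density (assumed value)
G = 9.81              # m/s^2, gravitational acceleration
DEPTH = 3800          # m, approximate depth of the Titanic wreck

pressure_pa = RHO_SEAWATER * G * DEPTH
print(f"Pressure at {DEPTH} m: {pressure_pa / 1e6:.1f} MPa")   # ~38.2 MPa
print(f"Relative to 1 atm:   {pressure_pa / 101_325:.0f} x")   # ~377 atmospheres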
Carbon fibre is a high-tech and extremely desirable material for its tensile strength, strength-to-weight ratio, high chemical resistance, and temperature tolerance. The material has proven itself since the 1960s in the aerospace, military, and motorsport industries; however, the Titan was the first case of carbon fibre being used for a crewed submersible. At first glance, a carbon fibre hull offers the advantage of significantly reducing the vessel’s weight (50-75% lighter than titanium) while maintaining tensile strength, allowing for greater natural buoyancy. Without the need for added buoyancy systems, the hull could hold space for more passengers at one time. As carbon fibre is cheaper than titanium and passengers pay $250,000 a seat, carbon fibre may appear to be the better business plan. However, although carbon fibre performs extremely well under tension loads, it has no resistance to compression loads (as with any fibre) unless it is infused with a polymer to hold the fibres together (see Figure 2). The polymer in the CFRP holding the fibres in alignment is what allows the material to resist compressive loads without bending, by distributing the forces to all the fibres in the structure. This means the material is anisotropic: it is much stronger along the direction of the fibres than across them (the same way wood is stronger along the grain). Therefore, individual layers of the CFRP must be oriented strategically to ensure the structure can withstand the expected load in all directions. A submersible hull intended to reach the ocean floor must withstand a tremendous compressive load, far higher than carbon fibre is typically optimised for in the aviation and automotive racing industries, and the behaviour of carbon fibre under such high compressive loads is currently an under-researched field. Although it is likely possible for carbon fibre to be used in deep-sea vessels in the future, this would require rigorous testing and intensive research, which OceanGate did not carry out. Despite this, the Titan had apparently attempted 90 dives since 2017, and the repeated cycling of the carbon fibre composite at a high percentage of its yield strength would have made the vessel especially vulnerable to any defects growing to a critical level.

Upon simple inspection, the Titan also raises other immediate structural concerns. Submersible hulls are usually spherical or slightly elliptical, which allows the external pressure to load the shell evenly at every point. The unique tube shape of the Titan’s hull (see cover image) would not distribute pressure equally, and this issue was ‘addressed’ with the use of separate end-caps; a simple stress comparison between a sphere and a cylinder is sketched at the end of this section. The joints that attach the end-caps to the rest of the hull only introduced further structural weaknesses, which made the vessel especially vulnerable to collapse from micro-cracks. The Titan’s glass viewport was another structurally unsound feature (see Figure 3). David Lochridge, the former director of OceanGate’s marine operations between 2015 and 2018, who was fired for raising concerns about the submersible’s safety features, claimed the company that made the material had only certified its use down to 1,300 m, some 2,500 metres short of the Titanic’s depth. Subjecting materials that lacked the properties to withstand such compressive pressure to these immense forces made the Titan’s failure inevitable.
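To make the shape argument concrete, the sketch below applies the standard thin-walled pressure-vessel formulas: a spherical shell carries a membrane stress of p·r / (2t) everywhere, whereas a cylindrical shell sees a hoop stress of p·r / t, i.e. roughly double, for the same radius and wall thickness. The radius and thickness used are illustrative assumptions, not the Titan’s actual dimensions, and end-cap joints and composite behaviour are ignored.

# Sphere vs cylinder under external pressure (thin-wall approximation, illustrative only).
# r and t are assumed values for illustration, not the Titan's real dimensions.
pressure = 38.2e6   # Pa, approximate pressure at Titanic depth (see the calculation above)
r = 0.8             # m, assumed mid-surface radius of the shell
t = 0.12            # m, assumed wall thickness

sphere_stress = pressure * r / (2 * t)   # membrane stress, uniform over a sphere
cylinder_hoop = pressure * r / t         # hoop stress in the cylindrical section

print(f"Sphere membrane stress : {sphere_stress / 1e6:.0f} MPa")   # ~127 MPa
print(f"Cylinder hoop stress   : {cylinder_hoop / 1e6:.0f} MPa")   # ~255 MPa, about twice as high

In practice, buckling rather than membrane stress often governs externally pressurised shells, but the comparison still illustrates why deep-diving hulls favour spheres over tubes.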
Cutting corners in the interest of business
The foundation of the implosion’s cause was OceanGate’s insistence on cutting corners in Titan’s design to save time and money. The Titan was not certified for deep-sea diving by any regulatory body; instead, OceanGate asked passengers to sign a waiver stating that the Titan was ‘experimental’. As underwater vessels operate in international waters, there is no single official organisation responsible for ensuring vessel safety standards, and certification is not compulsory. However, many companies choose to have their vessels assessed and certified by one of several organisations. According to The Marine Technology Society submarine committee, there are only 10 marine vessels capable of reaching Titanic-level depths, all of which are certified except for the Titan. In a blog post on the company website, OceanGate claimed that the way the Titan had been designed fell outside the accepted system - but that this “does not mean that OceanGate does not meet standards where they apply”. The post continued that classification agencies “slowed down innovation… bringing an outside entity up to speed on every innovation before it is put into real-world testing is anathema to rapid innovation”. According to former engineers and consultants at OceanGate, the Titan’s pressure hull also did not undergo extensive full-depth pressure testing, as is standard for an underwater vessel. Carbon fibre - the primary material of the Titan’s hull - is extremely unpredictable under high compressive loads, and there is currently no reliable way to measure its fatigue. This makes it an unreliable and dangerous material for deep-sea dives. OceanGate CEO Stockton Rush, who was a passenger on the Titan during its final, fatal dive in 2023, described the glue holding the submersible’s structure together as “pretty simple” in a 2018 video, admitting “if we mess it up, there’s not a lot of room for recovery”. After 90 attempted dives since 2017 with a 14% success rate, micro-cracks from repeated loading were all but inevitable, and because carbon fibre composites fail extremely suddenly rather than gradually, such defects would ultimately result in the vessel’s instantaneous implosion. On the 15th of July 2022 (dive 80), Titan experienced a “loud acoustic event”, likely from the hull’s carbon fibre delaminating, which was heard by the passengers on board and picked up by Titan’s real-time monitoring system (RTM). Data from the RTM later revealed that the hull had permanently shifted following this event. The Titan continued to be used after this event without further testing of the carbon fibre - on the grounds that the hull was ‘too thick’ - which meant micro-cracks and air bubbles in the epoxy resin went undiscovered until it was too late.

Another fundamental flaw lay in the Titan’s sole means of control being a Bluetooth gaming controller. While this is not an uncommon practice, especially for allowing tourists to try controlling the vessel once it has reached its destination, it is essential that robust secondary and even tertiary controls of a much higher standard are available. The over-reliance on wireless and touch-screen control, particularly Bluetooth, which is highly sensitive to interference, was a dangerous and risky design choice. Although it was unlikely to have caused the implosion on its own, cutting corners in the electronics and controls of a vessel that must operate in dangerous locations is irresponsible and unsafe. Submersibles operating at extreme depths require robust fail-safes, including emergency flotation systems and locator beacons.
Again, OceanGate cut corners in developing Titan’s emergency recovery systems, using very basic methods and off-the-shelf equipment. In the event of catastrophic failure, the absence of autonomous emergency measures is fatal. Given the extent of the damage to, and the poor design of, the vessel’s carbon fibre hull, it is unlikely that even the most advanced emergency systems could have prevented an implosion of this magnitude. Still, the carelessness displayed in almost every aspect of the submersible’s design was ultimately the cause of the fatal Titan tragedy.

Conclusion
In a 2019 interview, OceanGate’s former CEO Stockton Rush said: “There hasn’t been an injury in the commercial sub industry in over 35 years. It’s obscenely safe because they have all these regulations. But it also hasn’t innovated or grown — because they have all these regulations.” In the world of engineering, shortcuts can be catastrophic. Whilst risk-taking is undeniably essential to innovation, the Titan tragedy was entirely preventable had proper risk management techniques been employed. OceanGate had the potential to revolutionise the use of carbon fibre in deep-sea industries, but consistent corner-cutting, a failure to invest in the required real-world testing, and the arrogance to ignore expert warnings ultimately led Titan’s story to echo the overconfidence of the Titanic’s “she is unsinkable!”. As regulations on submersibles tighten and research into carbon fibre increases, the fundamental cause of the tragic implosion should serve as a wake-up call. Assumptions are deadly: trust the science, invest in the proper research, test every bolt, and never underestimate the ocean’s relentless power.

Written by Varuna Ganeshamoorthy

Related articles: Engineering case study- silicon hydrogel / Superconductors / Building Physics
