
Search Index

258 items found

  • Key discoveries in the history of public health | Scientia News

    Key historical events and theories in public health

Introduction

Public health, the field concerned with promoting health and preventing disease within a society, is now more crucial than ever. Numerous events and concepts have helped shape our current health systems; without that earlier knowledge to build on, it is unlikely they would have advanced as far as they have. This article focuses on some of those key events and concepts.

Humoral Theory (Ancient Greek and Roman times)

To begin, there was the humoral theory, which held that disease was caused by imbalances among four bodily fluids, or humours: blood, yellow bile, black bile and phlegm, equated with the elements of air, fire, earth and water respectively. The imbalance could arise from habits such as overeating and too little or too much exercise, or from external factors such as the weather. The theory is thought to have originated in the Hippocratic Corpus, a compilation of around 60 medical documents from the Ancient Greek era traditionally attributed to Hippocrates. Although we now know the theory is flawed, it provided a foundational understanding of the human body and was used in medicine for centuries before being discredited in favour of the germ theory established during the mid-19th century.

Miasma Theory (Ancient Greek era to the 19th century)

Another theory replaced by germ theory was the miasma theory, which stated that diseases like the plague and cholera spread through toxic vapours rising from the ground or from decomposing matter. Along with the humoral theory, it was accepted for thousands of years, from the Ancient Greek era onwards. During the cholera outbreaks of the Victorian era, John Snow's theory that polluted water caused cholera had still not been accepted by the scientific community at the time of his death in 1858.
Eventually, though, his theory gained acceptance when Joseph Bazalgette rebuilt London's sewer system to prevent further deaths from cholera. This, together with germ theory, led to the miasma and humoral theories being disproved, although both had provided a foundational understanding of how diseases spread.

The discovery of vaccines (late 18th century)

Beyond theories such as the four humours, certain discoveries advanced public health directly, such as vaccination, which eradicated smallpox and is still used today to reduce the severity of diseases such as COVID-19, influenza and polio. The origins of successful vaccines can be traced back to Edward Jenner, who in 1796 took samples from a milkmaid's cowpox lesions because he had noticed that contracting cowpox protected against smallpox. With this in mind, he inoculated an 8-year-old boy, who developed mild symptoms and then recovered; when later exposed to smallpox, the boy did not develop the disease. Without this discovery, the human population would likely be far more vulnerable to infectious diseases, and public health systems weaker and less stable.

Image: a COVID-19 injection.

Germ Theory (19th century)

As for current scientific theories underpinning public health, there is the now widely accepted germ theory, which states that microorganisms can cause disease, and which Robert Koch helped establish in the 1870s. Koch examined under a microscope the blood of cattle that had died of anthrax, observed rod-shaped bacteria, and hypothesised that these bacteria caused the disease. To test this, he infected mice with blood from the cattle, and the mice also developed anthrax.
After these tests, he developed his postulates. Even though the postulates have limitations, for example they do not account for prions, and certain bacteria do not satisfy them, they remain vital to microbiology and, in turn, to public health.

The establishment of modern epidemiology (19th century)

Another key concept for public health is epidemiology, the study of the distribution and determinants of chronic and infectious diseases within populations. One of its key figures is John Snow, who investigated the 1854 cholera epidemic in London and discovered that contaminated water from a specific water pump was the source of the outbreak. This work provided a basic understanding of cholera and earned Snow the title of 'father of modern epidemiology'. Events such as this have paved the way for health systems to become more robust in controlling outbreaks of diseases such as influenza and measles.

Conclusion

Looking at the key events above, it is evident that each played an essential role, through the contributions of the scientists involved, in building today's public health systems. However, public health, like any other science, is constantly evolving, and there are further advances to look forward to that can increase health knowledge.

By Sam Jarada

Related articles: Are pandemics becoming less severe? / Rare zoonotic diseases

  • The Lyrids meteor shower | Scientia News

    The Lyrids meteor shower

The Lyrids bring an end to the meteor-shower drought of the first few months of the year. On April 22nd the shower is predicted to reach its peak, offering skygazers the chance to see up to 20 bright, fast-moving meteors per hour, leaving long, fiery trails across the sky, without any specialist equipment. The name Lyrids comes from the constellation Lyra (the lyre, or harp), which is the radiant point of this shower, i.e. the position in the sky from which the paths of the meteors appear to originate. In the Northern Hemisphere, Lyra rises above the horizon in the northeast and reaches the zenith (directly overhead) shortly before dawn, making this the optimal time to observe the shower.

Lyra is a prominent constellation, largely thanks to Vega, which forms one of its corners and is one of the brightest stars in the sky. Interestingly, Vega is defined as the zero point of the magnitude scale, a logarithmic system used to measure the brightness of celestial objects. Technically, the brightness of all stars and galaxies is measured relative to Vega!

Have you ever wondered why meteor showers occur exactly one year apart, and why they always radiate from the same defined point in the sky? The answer lies in the Earth's orbit around the Sun, which takes roughly 365 days. During this time, Earth may encounter streams of debris left by a comet: gas and dust particles released when an icy comet approaches the Sun and vaporizes. As the debris particles enter Earth's atmosphere, they burn up due to friction, creating a streak of light known as a meteor. (Meteorites, by contrast, are fragments that make it through the atmosphere to the ground.) The Lyrids peak in mid-to-late April each year because the Earth encounters the same debris stream at the point on its orbit corresponding to mid-to-late April.
Comets and their debris trails have very eccentric but predictable orbits, and the Earth passes through the trail of Comet Thatcher in mid-to-late April every year. Earth's orbit also intersects the trail at approximately the same angle each year, so from the perspective of an observer on Earth, the radiant point of the meteors, mapped onto the canvas of background stars, always falls in the constellation Lyra.

Figure: The Lyrids meteor shower peaks in mid-to-late April each year. Image/ EarthSky.org

This year there is a fortunate alignment of celestial events: the New Moon occurs on April 20th, so by the time the Lyrids reach their maximum intensity the Moon is only 6% illuminated, giving darker skies and a better chance of seeing this dazzling display.

Written by Joseph Brennan
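The Vega-based magnitude scale mentioned above can be made concrete with a short sketch. Apparent magnitude is defined as m = -2.5 log10(F / F_Vega), so Vega itself (F = F_Vega) sits at magnitude 0, and every 5 magnitudes corresponds to a factor of 100 in brightness. The sample fluxes below are invented for illustration:

```python
import math

def apparent_magnitude(flux, vega_flux=1.0):
    """Vega-based magnitude: m = -2.5 * log10(F / F_Vega).
    Vega itself (flux == vega_flux) defines magnitude 0."""
    return -2.5 * math.log10(flux / vega_flux)

# A star 100x fainter than Vega is 5 magnitudes dimmer (larger number = fainter).
print(apparent_magnitude(0.01))
# A star 100x brighter than Vega has a negative magnitude, 5 below Vega's.
print(apparent_magnitude(100.0))
```

Note the scale runs backwards: brighter objects have smaller (or negative) magnitudes, which is why Vega at 0 is still one of the brightest stars in the sky.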

  • AI: the next step in diagnosis and treatment of genetic diseases | Scientia News

    AI: the next step in diagnosis and treatment of genetic diseases

AI can process data sets and identify patterns and biomarkers

With the development of ever more intricate Artificial Intelligence (AI) software, which has rapidly grown from chaotic chatbots to the more polished ChatGPT, it is easy to think we are seeing the rise of powerful artificial intelligence that could potentially replace us all. However, there is one problem: originality does not exist for AI, at least not complete originality. At its most basic, an AI program is trained on a set of data, whether a web-scale corpus, as in the case of ChatGPT, or a few images and phrases gathered from the internet. An AI therefore knows no more than it can quote or infer from the data it was given, which is why a piece of art, a picture of a family, or a short story that AI is asked to produce is often a replica of existing techniques, or a chaotic and terrifying mess of the images it was trained on. But here also lies its strength: AI can take in thousands of images and data sets and notice minor changes and differences the average person could not. It is not AI's ability to create the unique, but its ability to recognise the mundane, that we can utilise, even in diagnosing and treating genetic disorders.

Diagnosis

By analysing PET, MRI, fMRI and genetic data, AI can process enormous data sets and identify subtle patterns and biomarkers that often elude human observation, enabling earlier and more precise diagnosis. A good reference for the application of AI to genetic disorders is its so-far successful use in diagnosing Huntington's disease.

Huntington's disease diagnosis using AI

Huntington's disease presents as involuntary movements and a decline in decision-making processes.
Huntington's disease is a genetic disorder, meaning it is caused by a faulty gene, in this case the huntingtin gene (Htt). The Huntington's disease mutation results from an expansion of a CAG trinucleotide repeat tract in Htt, built from the nucleotides cytosine, adenine and guanine (DNA building blocks). CAG repeats are common and usually harmless, but individuals with Huntington's disease carry an abnormally high number of them (more than 36). With that many repeats, the Htt protein does not fold into its proper shape; the misfolded protein binds other proteins and becomes toxic to the cell, ultimately causing the death of crucial medium spiny neurons (MSNs) in the basal ganglia. The basal ganglia are brain structures responsible for fine-tuning our motor processes, essentially by allowing neurons, via MSNs, to respond in a preferred direction (towards a target muscle) rather than a null direction. So it is clear how Huntington's disease symptoms arise: mutant Htt leads to the death of MSNs, the basal ganglia lose their ability to control movement, and the characteristic involuntary movements follow, among other symptoms. Because these changes in Htt and the loss of MSNs in the basal ganglia are well characterised, PET, MRI and fMRI scans are often used in diagnosing Huntington's disease, alongside genetic and mobility tests. By collecting and extracting clinical and genetic data, certain AI algorithms can analyse the broad range of Huntington's disease clinical manifestations, spot differences, including minute changes in the basal ganglia that a doctor might miss, and make an earlier diagnosis. One branch of AI that has proved effective is machine learning.

Machine learning models in diagnosis

Machine learning uses data and algorithms to imitate the way humans learn.
For Huntington's disease diagnosis, this involves identifying biomarkers and patterns in medical images, gene studies and mobility tests, and detecting subtle differences between data sets that distinguish Huntington's disease patients from healthy controls. Machine learning comes in many forms, but the decision tree model, in which the AI uses a decision tree as illustrated in the Project Gallery, has proven very effective. A decision tree model considers decisions and their possible consequences and breaks them into subsets branching downward, from decision to effect. Recent research has applied this model to gait dynamics data: variation in stride length, how unsteady a person is while walking, and the degree to which one stride interval (the time between strides) differs from the previous and subsequent ones. It is widely accepted that abnormal variations in stride (reduced walking speed, a widened stance) are a symptom of Huntington's disease. By training on this gait data and learning a typical value of stride variation across trial patients, the model can discern which patients show the stride variation associated with Huntington's disease (higher variation) and which do not. Using this method, researchers were able to identify which gaits belonged to Huntington's disease patients with an accuracy of up to 100%. Decision tree models have also proved useful for identifying whether a gene is linked with Huntington's disease, comparing patients' genetic information with prefrontal cortex samples, with an accuracy of 90.79%.
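The gait-based classification described above can be sketched as a deliberately tiny, hand-rolled "decision stump" (a one-level decision tree). Everything here is invented for illustration: the stride-interval numbers are synthetic, and the threshold rule (split midway between the two groups' mean variabilities) is a stand-in for the full decision-tree learners the published studies used on real gait recordings.

```python
# Toy decision stump: classify gait recordings by stride-interval variability.
# Higher variability is treated as HD-like, mirroring the idea (not the data)
# of the gait-dynamics studies described above.

def stride_variability(stride_intervals):
    """Population standard deviation of one subject's stride intervals (s)."""
    n = len(stride_intervals)
    mean = sum(stride_intervals) / n
    return (sum((x - mean) ** 2 for x in stride_intervals) / n) ** 0.5

def learn_threshold(controls, patients):
    """Place the split midway between the two groups' mean variabilities."""
    mc = sum(map(stride_variability, controls)) / len(controls)
    mp = sum(map(stride_variability, patients)) / len(patients)
    return (mc + mp) / 2

# Synthetic training data: controls walk steadily, patients do not.
controls = [[1.10, 1.12, 1.09, 1.11], [1.05, 1.06, 1.04, 1.05]]
patients = [[1.10, 1.45, 0.90, 1.30], [1.00, 1.40, 0.85, 1.25]]

threshold = learn_threshold(controls, patients)

def classify(subject):
    """True if the subject's stride variability falls on the HD-like side."""
    return stride_variability(subject) > threshold

print(classify([1.08, 1.10, 1.09, 1.11]))  # steady gait -> False
print(classify([1.00, 1.50, 0.80, 1.35]))  # erratic gait -> True
```

A real decision tree would learn several such splits (over stride length, stance width, and so on) and arrange them in a branching hierarchy; the stump shows only the core idea of one learned threshold separating two groups.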
With these results, and further models showing incredible promise, AI is already proving useful in identifying and diagnosing genetic disorders such as Huntington's disease. But this leads us to ask: can AI also help in treating them?

Treatment: current studies in cystic fibrosis

While AI models can be applied diagnostically to disorders such as Huntington's disease, they may also be relied upon in treatment. Tailored treatment is the focus of current research, with one project aiming to improve the lives of those suffering from cystic fibrosis. Around 10,800 people are recorded as having cystic fibrosis in the UK; this debilitating disorder causes a buildup of thick mucus, leading to persistent infections and other organ complications. The most common cause is a deletion in the gene coding for the protein CFTR, which, as with the huntingtin protein in Huntington's disease, leads to improper folding. The misfolded CFTR is retained in the wrong place in the cell, so it can no longer maintain the balance of salt and water on body surfaces. The complex symptoms arising from this imbalance make the disease very difficult to manage, but there is hope, and it comes in the form of SmartCare. SmartCare was a home-monitoring project that followed 150 people with cystic fibrosis for six months, having them record their lung function, pulse, oxygen saturation and general wellness and upload the data to an app. Researchers at the University of Cambridge then used machine learning to build a predictive algorithm that took this lung, pulse and oxygen-saturation data, identified patterns associated with a decline in a patient's condition, and predicted that decline much faster than the patient or their doctor could.
On average, the model could predict a decline in a patient's condition 11 days before the point at which the patient would typically start antibiotics, allowing health providers to respond sooner and patients to feel less restricted by their health. The project was so successful that the US CF Foundation is now supporting a clinical implementation study, called Breath, which began in 2019 and continues to this day.

Although there is a long way to go, with AI the future can seem brighter. In Huntington's disease and cystic fibrosis we can see its effectiveness in both diagnosis and treatment. With the usage of AI predicted to increase, there is a great outlook for patients and an opportunity for higher-quality care, which could ultimately ease suffering and prevent deaths. All this positive research tells us that AI is our friend (although science fiction would often persuade us otherwise), and that it will guide us through the tricky diagnosis and treatment of our most challenging diseases, even those ingrained in our DNA.

Written by Faye Boswell

Related article: AI in drug discovery

Project Gallery
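Returning briefly to the genetics described in the diagnosis section: the "more than 36 CAG repeats" criterion for the Htt gene is simple enough to make concrete. The sequence below is invented for illustration, and real diagnostic assays size the repeat tract with PCR-based methods rather than string matching:

```python
import re

def longest_cag_run(seq):
    """Length, in repeats, of the longest uninterrupted run of 'CAG'."""
    runs = re.findall(r"(?:CAG)+", seq.upper())
    return max((len(r) // 3 for r in runs), default=0)

HD_THRESHOLD = 36  # more than 36 repeats is associated with Huntington's disease

normal = "ATG" + "CAG" * 20 + "CCGCCA"    # 20 repeats: in the typical range
expanded = "ATG" + "CAG" * 45 + "CCGCCA"  # 45 repeats: in the pathogenic range

for name, seq in [("normal", normal), ("expanded", expanded)]:
    n = longest_cag_run(seq)
    print(name, n, "HD-range" if n > HD_THRESHOLD else "typical")
```

The threshold explains why the mutation is described as an expansion: the repeat is present in everyone, and it is only its length that separates typical alleles from disease-causing ones.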

  • The Challenges in Modern Day Chemistry | Scientia News

    The Challenges in Modern Day Chemistry

And can we overcome them?

Chemistry, heralded as the linchpin of the natural sciences, serves as the bedrock of our comprehension of the world and plays a pivotal role in resolving the multifaceted global challenges confronting humanity. In the modern era, chemistry has undergone a prodigious transformation, with researchers persistently pushing the frontiers of knowledge and technological application. Yet this remarkable trajectory is shadowed by a constellation of interwoven challenges that demand innovative, often paradigm-shifting solutions. This article explores the most salient and formidable challenges presently besetting contemporary chemistry.

Sustainability and the Imperative of Green Chemistry

The paramount challenge confronting modern chemistry is the burgeoning imperative of environmental sustainability. The chemical industry is a colossal contributor to ecological degradation and the depletion of vital resources, so there is an urgent need to develop greener, environmentally benign chemical processes. Green chemistry is at the vanguard of this transformation, placing emphasis on designing processes and products that avoid hazardous substances. Researchers in this sphere are diligently exploring alternative, non-toxic materials and energy-efficient methodologies, thereby diminishing the ecological footprint intrinsic to chemical procedures.
Energy Storage and Conversion at the Frontier

In an epoch marked by surging demand for renewable energy sources such as photovoltaic solar panels and wind turbines, efficient energy storage and conversion technologies have attained unparalleled urgency. Chemistry plays a seminal role in advanced batteries, fuel cells and supercapacitors. However, challenges such as raising energy density, improving durability and reducing production costs remain obstinate puzzles. In response, researchers are actively pursuing novel materials and innovative electrochemical processes to surmount these complexities.

Drug Resistance as a Growing Predicament

The advent of antibiotic-resistant bacterial strains, and drug resistance across diverse therapeutic areas, constitutes a formidable quandary for medicinal chemistry. With pathogens continually evolving, scientists face the Herculean task of continually devising novel antibiotics and antiviral agents. Moreover, personalised medicine and targeted therapies demand groundbreaking paradigms in drug design and precision drug delivery systems. Circumventing drug resistance while avoiding deleterious side effects is a quintessential challenge of contemporary chemistry.

Ethical Conundrums and the Regulatory Labyrinth

As chemistry forges ahead, ethical and regulatory questions grow in complexity and profundity. Intellectual property rights, the contours of responsible innovation, and the spectre of malicious misuse of chemical knowledge demand careful contemplation and meticulously crafted ethical frameworks.
Striking a nuanced equilibrium between the imperatives of scientific advancement and the obligations of prudent stewardship of chemical discoveries constitutes an enduring challenge that compels the chemistry community to act with sagacity and acumen.

In conclusion...

Modern-day chemistry, dynamic and perpetually evolving, stands as a lodestar of innovation across myriad industries while confronting multifarious global challenges. It does so against the backdrop of its own formidable hurdles, from environmental responsibility to drug resistance and the tangle of ethical and regulatory dilemmas. Overcoming these challenges demands interdisciplinary collaboration, imaginative innovation, and an unwavering commitment to the responsible, ethically conscious stewardship of the profound knowledge and transformative potential that contemporary chemistry affords. As humanity continues towards an ever-expanding understanding of the chemical world, addressing these challenges is the sine qua non for an enduringly sustainable and prosperous future.

Written by Navnidhi Sharma

Related article: Green Chemistry

Project Gallery

  • Allergies | Scientia News

    Allergies

Deconstructing allergies: mechanisms, treatments, and prevention

Modern populations have witnessed a dramatic surge in the number of people grappling with allergies, a condition that can lead to a myriad of health issues such as eczema, asthma, hives and, in severe cases, anaphylaxis. Common allergens include antibiotics like penicillin, as well as animals, insects, dust and various foods; for those who are allergic, these substances can trigger life-threatening reactions through an abnormal immune response. The need for strict dietary restrictions and the constant fear of accidental encounters with allergens often weigh on patients and their families. Negligent business practices and mislabelled food have even led to multiple reported deaths, underscoring the gravity of allergies and their alarming rise in prevalence.

The primary driver of the global increase in allergies is believed to be a lack of exposure to microorganisms during early childhood. The human microbiome, the collection of microorganisms that live in and on our bodies, is a key player in our immune system, and the rise in sanitation practices is thought to reduce its diversity, potentially affecting immune function. This lack of exposure to infections may cause the immune system to overreact to normally harmless substances such as allergens. There is also speculation about the impact of vitamin D deficiency, which is becoming more common with increased time spent indoors; vitamin D supports a healthy immune response, and its deficiency could worsen allergic reactions.

Immune response

Allergic responses occur when specific proteins within an allergen are encountered, triggering an immune response of the kind normally used to fight infections. The allergen's proteins bind to complementary receptors on macrophages, causing these cells to engulf the foreign substance.
Peptide fragments from the allergen are then presented on the cell surface via major histocompatibility complexes (MHCs), activating receptors on T helper cells. These activated T cells stimulate B cells to produce immunoglobulin E (IgE) antibodies against the allergen. This sensitises the immune system, making the individual hypersensitive. Upon re-exposure, IgE antibodies already bound to receptors on mast cells capture the allergen, triggering the cells to release histamine. Histamine causes vasodilation and increases vascular permeability, leading to inflammation and erythema. In milder cases, patients may experience itching, hives and a runny nose; in severe reactions, however, intense swelling can constrict the airway, potentially leading to respiratory compromise or even arrest. At that critical point, conventional antihistamine therapy may not be enough, necessitating the immediate use of an EpiPen to alleviate symptoms and prevent further deterioration. EpiPens administer a dose of epinephrine, also known as adrenaline, when an individual experiences anaphylactic shock, which is typically characterised by breathing difficulties. The epinephrine relaxes the muscles of the airway, making breathing easier, and counteracts the drop in blood pressure associated with anaphylaxis by narrowing the blood vessels, helping to prevent symptoms such as weakness or fainting. EpiPens are the primary treatment for severe allergic reactions leading to anaphylaxis and have proven effective. However, the reliance on EpiPens underscores the need for preventative measures for individuals with allergies before a reaction occurs.

Preventative treatment

Young individuals may have a genetic predisposition to developing allergies, a condition referred to as atopy.
Many atopic individuals develop multiple hypersensitivities during childhood, though some outgrow these allergies by adulthood. For high-risk atopic children, preventative measures may offer a promising way to reduce the risk of developing severe allergies. Clinical trials in atopic infants have explored immunotherapy: continuous exposure to small doses of peanut allergens to prevent the onset of a full-blown allergy. Skin-prick tests for peanut allergens were performed first, and only children with negative or mild reactions were selected; those with severe reactions were excluded because of the high risk of anaphylactic shock with continued exposure. The remaining participants were randomly assigned either to consume peanuts or to follow a peanut-free diet. Following these infants as they aged revealed that continuous exposure reduced the prevalence of peanut allergy by the age of 5: only 3% of atopic children exposed to peanuts developed an allergy, compared with 17% of those in the peanut-free group.

The rise in severe allergies poses a growing concern for global health. Once an atopic individual develops an allergy, mitigating their hypersensitivity can be challenging. Current approaches often amount to waiting for children to outgrow their allergies, overlooking the ongoing challenges faced by adults who remain highly sensitive to allergens. Preventative measures, such as early exposure through immunotherapy, could enhance the quality of life of future generations and prevent sudden deaths in at-risk individuals. In conclusion, the dramatic surge in the prevalence of allergies in modern populations requires more attention from researchers and healthcare providers. Living with allergies brings many complexities into a person's life even before they have a serious reaction.
Current treatments focus on post-reaction emergency care; preventative strategies remain a pressing need. With cases of allergies predicted to rise further, research into this global health issue will become increasingly important. Early trials of immunotherapy treatments have already shown promising results, and with further research and implementation these treatments could improve the quality of life of future generations.

Written by Charlotte Jones

Project Gallery
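The trial figures quoted above (3% of exposed children versus 17% of the peanut-free group developing an allergy by age 5) imply a large relative risk reduction. A quick sketch of that arithmetic, using only the two percentages from the text:

```python
# Risk of developing a peanut allergy by age 5, from the trial figures above.
risk_exposed = 0.03    # early-exposure group
risk_avoidance = 0.17  # peanut-free group

absolute_risk_reduction = risk_avoidance - risk_exposed          # 0.14
relative_risk = risk_exposed / risk_avoidance                    # ~0.18
relative_risk_reduction = absolute_risk_reduction / risk_avoidance

print(f"Absolute risk reduction: {absolute_risk_reduction:.0%}")
print(f"Relative risk reduction: {relative_risk_reduction:.0%}")
```

In words: early exposure cut the absolute risk by 14 percentage points, which relative to the 17% baseline is roughly an 82% reduction, the kind of effect size that makes preventative immunotherapy so promising.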

  • Antisense oligonucleotide gene therapy for treating Huntington's disease | Scientia News

    Antisense oligonucleotide gene therapy for treating Huntington's disease

A potential gene therapy

Huntington's disease (HD) is an inherited neurodegenerative disease caused by a CAG expansion in exon 1 of the huntingtin gene. The expanded alleles produce an extended polyglutamine tract in the huntingtin protein, resulting in intracellular signalling defects. Antisense oligonucleotide (ASO) gene therapy is currently being pioneered to treat HD. In this therapy, oligonucleotides are introduced into cells, where they bind the target huntingtin mRNA and inhibit formation of the huntingtin protein, either by physically blocking translation of the mRNA (figure 1) or by recruiting RNase H to degrade it. Previous ASO gene therapy experiments conducted in R6/2 mice, which express the human huntingtin gene, have been successful. The R6/2 mouse model is commonly used in HD research to replicate HD symptoms and is therefore useful for testing potential treatments; the transgenic R6/2 mouse carries an N-terminal fragment of the mutant huntingtin gene with a CAG repeat expansion within exon 1. In one successful experiment, scientists treated one group of R6/2 mice with an ASO that suppresses production of human huntingtin mRNA, while the control group received saline. The aim was to confirm whether ASO therapy improves survival in R6/2 mice. The results showed that human huntingtin mRNA levels in the ASO-treated mice were lower than in the controls. Furthermore, the ASO-treated mice had a higher percentage survival and lived longer (21 weeks) than the control mice, which survived to 19 weeks.
It could thus be concluded that, with less human huntingtin mRNA present in the ASO group, less of it was translated, and so less huntingtin protein was synthesised than in the control group. The results of this study are enormously informative for understanding how gene therapy might be used in the future to treat other neurological diseases. However, before ASO therapy is approved for clinical use, further trials will need to be conducted in humans to verify the same successful outcomes as in the R6/2 mice. If approved, the symptoms of HD, including dystonia, could be safely controlled with ASO therapy. Scientists must also consider that an increased survival of only an additional two weeks, as shown in the experiment, does not necessarily translate into an increased quality of life for the patient; it therefore needs to be established whether the benefits of ASO gene therapy outweigh its risks. Separately, the drug PBT2, which influences copper interactions with abnormal proteins, is being studied as a potential treatment for HD. Some studies have suggested that aggregation of mutant huntingtin proteins may be driven by interactions with metals, including copper; PBT2 is therefore designed to chelate metals and consequently decrease abnormal protein aggregation in the body. The treatment has been shown to improve motor tasks and increase lifespan in R6/2 mice. However, as it still has many shortcomings, further studies over a longer period will be needed to confirm a successful outcome in HD patients.

Written by Maria Z Kahloon
Valcárcel-Ocete L, Alkorta-Aranburu G, Iriondo M, Fullaondo A, García-Barcina M, Fernández-García JM, et al. Exploring genetic factors involved in Huntington disease age of onset: E2F2 as a new potential modifier gene. PLoS One. 2015;10(7):e0131573.

Liou S. Antisense gene therapy [Internet]. Stanford.edu. 2010 [cited 2021 Aug 6]. Available from: https://hopes.stanford.edu/antisense-gene-therapy/

Huntington's disease research study in R6/2 mouse model [Internet]. Charles River Labs. [cited 2021 Aug 26]. Available from: https://www.criver.com/products-services/discovery-services/pharmacology-studies/neuroscience-models-assays/huntingtons-disease-studies/r62-mouse?region=3696

Frank S. Treatment of Huntington's disease. Neurotherapeutics. 2014;11(1):153–60.

Potkin KT, Potkin SG. New directions in therapeutics for Huntington disease. Future Neurol. 2018;13(2):101–21.

  • Conservation of marine iguanas | Scientia News

Conservation of marine iguanas

They are on the IUCN red list as 'vulnerable'

The marine iguana (Amblyrhynchus cristatus), also known as the sea iguana, is a unique species: it is the world’s only ocean-going lizard. Its main food source is algae; large males can dive to forage for it, while females feed during low tide. Marine iguanas can be found on rocky shorelines, but also on marshes, mangrove swamps and beaches of the Galapagos. Their range is limited to the Galapagos islands, so they are an isolated species. Currently, they are on the IUCN red list as ‘vulnerable’, with a population estimated at 200,000, and conservation efforts are needed to stabilise their numbers.

Key threats

There are three key threats to iguana populations. The first is invasive species: animals such as pigs, dogs and cats feed on young hatchlings and iguana eggs, which reduces the long-term survival rate of the species. Marine iguanas have not yet developed defence strategies against these predators. Humans also introduce pathogens to the islands that pose a threat to the species; because of its isolated habitat, the marine iguana lacks immunity to many pathogens and so has a higher risk of contracting diseases. Climate change is another key threat. El Niño is a weather event that prevents the cold, nutrient-rich waters on which marine wildlife depends from reaching the Eastern Tropical Pacific. This depletes algae populations, and the drop in food drastically reduces iguana populations (Figure 1). With global warming, El Niño events are expected to become more intense and more frequent. In addition, pollution from humans, such as oil spills and microplastics, is damaging their habitat.

Current and future conservation methods

Under the laws of Ecuador, marine iguanas are completely protected. Their land range is in the Galapagos National Park, and their sea range is within the Galapagos Marine Reserve.
They are also listed under CITES, which monitors trade in endangered animals to prevent damage to their numbers. Sanctuaries are also in place to mitigate against extinction, although the species' specialised diet makes captive care challenging.

So, what does the future hold for marine iguanas? The biggest challenge is the distribution of the species. The population is scattered across the different islands of the Galapagos; as such, there are at least 11 subspecies. This brings more complications to marine iguana conservation: as these subspecies specialise, it becomes less likely that they will interbreed, and thus more difficult to maintain the species' population. Introducing education and awareness programmes would better equip us to address the dangers faced by marine iguanas and could also serve as a tourism initiative for the Galapagos. This species is one of a kind, which is why it is so important for it to be protected. There should be a monitoring scheme, as suggested by MacLeod and Steinfartz, 2016 (Figure 2), but the location of these subspecies makes them difficult to monitor. However, a recent study using drone-based methods showed promising results (Figure 3). The overarching question remains: do we continue to conserve the current population in the Galapagos, or should we relocate the species to a less endangered habitat?

Written by Antonio Rodrigues

Related articles: Conservation of Galapagos Tortoises / 55 years of vicuna conservation

  • Exploring Ibuprofen | Scientia News

Exploring Ibuprofen

Its toxicodynamics, and balancing benefits and risks

What is Ibuprofen?

Ibuprofen is a common over-the-counter medicine that can be bought from supermarkets and pharmacies. It is primarily used for pain relief, such as back pain, period pain and toothaches, and can also be used for arthritis pain and inflammation. It is available in various forms, including tablets, capsules, gels and sprays for the skin.

The Toxicodynamics of Ibuprofen

Toxicodynamics refers to the biological effects of a substance after exposure to it. Scientists look at the mechanisms by which the substance produces toxic effects and the target organs or tissues it affects. Ibuprofen works by inhibiting the enzymes that synthesise prostaglandins, a group of lipid molecules that cause inflammation, including symptoms like redness, heat, swelling and pain. Therefore, after the action of Ibuprofen, inflammatory responses and pain are reduced. Ibuprofen affects target organs and tissues including the gastrointestinal tract, the kidneys, the central nervous system and the blood.

Balancing the Benefits and Risks

Ibuprofen’s mechanism of action means it is a safe and effective pain relief medication for most people. It is also easily accessible and easy to use. However, it can affect its target organs and tissues negatively and therefore can have serious side effects, especially if taken for an extended period of time and/or in high doses. These include heartburn, abdominal pain, kidney damage (especially in people who already have kidney problems), low blood count and more. It is therefore important to use Ibuprofen responsibly. This can be done by understanding and being well informed about its effects on the body, particularly its impact on organs and tissues. With caution and proper use, the side effects can be minimised.
One of the easiest ways to lessen side effects is to take the medication with food. Additionally, patients should take the lowest effective dose for the shortest possible time. If patients have a history of stomach problems, avoiding Ibuprofen and using alternatives is the best solution. Patients can also talk to their GP if they are concerned about side effects, and report any suspected side effects using the Yellow Card safety scheme on the NHS website.

Links to find out more:

https://www.nhs.uk/medicines/ibuprofen-for-adults/about-ibuprofen-for-adults/
https://www.sciencedirect.com/topics/pharmacology-toxicology-and-pharmaceutical-science/toxicodynamics
https://www.chm.bris.ac.uk/motm/ibuprofen/ibuprofenh.htm
https://www.ncbi.nlm.nih.gov/books/NBK526078/

Written by Naoshin Haque

  • Chirality in drugs | Scientia News

Chirality in drugs

Why chirality is important in developing drugs

Approximately half of all drugs currently on the market are chiral compounds, and nearly 90% of these chiral drugs are racemates, composed of an equimolar mixture of two enantiomers. Chirality is the property of an object that prevents it from being superimposed on its mirror image, similar to left and right hands. This generic characteristic of "handedness" plays a significant role in the development of many pharmaceutical drugs. It is interesting to note that 20 of the 35 drugs the Food and Drug Administration (FDA) authorised in 2020 are chiral drugs. For example, Ibuprofen, a chiral 2-arylpropionic acid derivative, is a common over-the-counter analgesic, antipyretic and anti-inflammatory medication. However, Ibuprofen and other medications from similar families can have side effects and risks related to their usage.

Chiral drugs have the drawback that only one of the two enantiomers may be active, while the other may be ineffective or have negative effects. The inactive enantiomer can occasionally interact with the active enantiomer, lowering its potency or producing undesirable side effects. Additionally, Ibuprofen and other members of the chiral family of pharmaceuticals can interact with other drugs, including over-the-counter and prescription medicines. To guarantee that only the active enantiomer is present in chiral drugs, it is crucial for pharmaceutical companies to closely monitor their production and distribution processes. To lessen the toxicity and adverse effects linked to the inactive enantiomer, medicinal chemistry has recently seen an increase in the use of enantiomerically pure drugs. In any case, the choice of whether to use a single enantiomer or a mixture of enantiomers of a given medicine should be based on clinical trial results and clinical expertise.
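Enantiomeric purity of the kind discussed above is commonly quantified as enantiomeric excess (ee), a standard measure in chiral chemistry. The formula is well established; the example amounts below are hypothetical, chosen only to illustrate the calculation.

```python
# Enantiomeric excess (ee): a standard measure of chiral purity.
# ee = |R - S| / (R + S) * 100, where R and S are the amounts of each
# enantiomer. A racemate (50:50 mixture) has ee = 0%; an enantiomerically
# pure drug has ee = 100%.

def enantiomeric_excess(r: float, s: float) -> float:
    """Return the percent enantiomeric excess for amounts r and s."""
    return abs(r - s) / (r + s) * 100.0

print(enantiomeric_excess(50, 50))  # racemate: 0.0
print(enantiomeric_excess(99, 1))   # nearly enantiopure: 98.0
```

In practice, ee is determined experimentally, for example by chiral chromatography or polarimetry, rather than from known amounts.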
Beyond the requirement to determine and control the enantiomeric purity of the enantiomers in a racemic mixture, the use of single-enantiomer drugs may result in simpler and more selective pharmacological profiles, improved therapeutic indices, simpler pharmacokinetics and fewer drug interactions. Although there have been instances where the wrong enantiomer caused unintended side effects, many medications are still used today as racemates, with their associated side effects; this is probably due to both the difficulty of chiral separation techniques and the high cost of production.

In conclusion, Ibuprofen and other chiral medications, including those used to treat pain and inflammation, can be useful, but they also carry a number of risks and adverse effects. It is critical to follow a doctor's instructions when using these medications and to be aware of possible interactions, allergic reactions and other hazards. To maintain the safety and efficacy of chiral-class medicines, pharmaceutical producers also have a duty to closely monitor their production and distribution.

By Navnidhi Sharma

  • Schizophrenia, Inflammation and Accelerated Aging: a Complex Medical Phenotype | Scientia News

Schizophrenia, Inflammation and Accelerated Aging: a Complex Medical Phenotype

Setting Neuropsychiatry In a Wider Medical Context

In novel research by Campeau et al. (2022), proteomic analysis was performed on 742 proteins from the blood plasma of 54 schizophrenic participants and 51 age-matched healthy volunteers. This investigation validated the previously contentious link between premature aging and schizophrenia by testing for a wide variety of proteins involved in cognitive decline, aging-related comorbidities and biomarkers of earlier-than-average mortality. The results demonstrated that age-linked changes in protein abundance occur earlier in life in people with schizophrenia. The data also help to explain the heightened incidence of age-related disorders and early all-cause death in schizophrenic people, with protein imbalances associated with both phenomena being present in all schizophrenic age strata over age 20.

This research is the result of years of medical interest in the biomedical underpinnings of schizophrenia. The comorbidities and earlier death associated with schizophrenia were focal points of research for many years, but only now have valid explanations been proposed for these phenomena. The greater incidence of early death in schizophrenia was attributed in this study to the increased abundance of certain proteins. Specifically, these included biomarkers of heart disease (Cystatin-3, Vitronectin), blood clotting abnormalities (Fibrinogen-B) and an inflammatory marker (L-Plastin). These proteins were tested for due to their inclusion in a dataset of protein biomarkers of early all-cause mortality in healthy and mentally ill people, published by Ho et al. (2018) in the Journal of the American Heart Association.
Furthermore, a protein linked to degenerative cognitive deficit with age, Cystatin C, was present at increased levels in schizophrenic participants both under and over the age of 40. This may help explain why antipsychotics have limited effectiveness in reducing the cognitive effects of schizophrenia. In this study, schizophrenics under 40 had plasma protein content similar to that of the healthy over-60 stratum, including biomarkers of cognitive decline, age-related disease and death, and showed the same likelihood of incidence of the latter phenomena as the healthy over-60 set.

These results could demonstrate the need to use medications often employed against age-related cognitive decline and mortality-linked protein abundances to treat schizophrenia. One such option is polyethylene glycol-Cp40, a C3 inhibitor used to treat paroxysmal nocturnal haemoglobinuria, which could be used to ameliorate the risk of developing age-related comorbidities in schizophrenic patients. This treatment may be effective in reducing C3 activation, which would reduce opsonisation (the tagging of detected foreign products in the blood). When overexpressed, C3 can cause the opsonisation of healthy blood cells, leading to their destruction (haemolysis), which can drive the reduction in blood volume implicated in cardiac events and other comorbidities. However, whether this treatment would benefit those with schizophrenia is yet to be proven.

The potential of this research to catalyse new treatment options for schizophrenia cannot be overstated. Since the publication of Kilbourne et al. in 2009, the role of cardiac comorbidities in causing early death in schizophrenic patients has been accepted medical dogma.
The discovery of exact protein targets to reduce the incidence of age-linked conditions and early death in schizophrenia will allow the condition to be treated more holistically, with greater recognition that schizophrenia is not only a psychiatric illness but also a neurocognitive disorder with affiliated comorbidities that must be adequately prevented.

By Aimee Wilson
