Search Index
- Biochemistry of cancer: integrins, the desirable targets | Scientia News
24/09/24, 10:50

Integrins are desirable targets in cancer therapy

Every year, around 8 million people worldwide die from cancer, and this number is expected to rise. Cancer can affect a wide range of organs in people of all ages, and it remains one of the most common and severe problems in clinical medicine. Its fundamental problems shed light on the biochemical and genetic processes underlying the unchecked expansion of cancer cells. The biochemical and biomechanical properties of the extracellular matrix (ECM) affect how cells sense and respond to their surroundings, influencing processes that cell health depends on, such as proliferation, apoptosis, migration, and differentiation. The tumour microenvironment also largely influences cancer metastasis, drug resistance, and recurrence.

Integrins are transmembrane glycoproteins that mediate connections between cells and the ECM and link it to the cytoskeleton. They relay information from the ECM through downstream signalling pathways and can hence control the properties of the cell. Mammals have so far been found to contain 24 different integrin heterodimers, formed by combining 18 α- and 8 β-subunits. A cell's ability to bind to specific ECM elements depends on its pattern of integrin expression, which also affects how the cell recognises and reacts to its surroundings. In cancer, tumour cells use these same integrin-mediated pathways to boost invasiveness and oncogenic survival, and to create a host milieu that supports tumour development and metastatic dissemination (Figure 1). Integrins are therefore attractive targets for cancer therapy, and several integrin antagonists, including antibodies and synthetic peptides, have been used successfully in the clinic.

Unligated integrins may have a detrimental effect on tumour survival. When integrins in adherent cells remain unligated, caspase 8 is cleaved, which in turn causes tumour cells to undergo apoptosis through a process known as integrin-mediated death (IMD) (Figure 2).

The behaviour of cancer cells is controlled both by the precise chemical signals transmitted through integrins and by the mechanical and physicochemical environment of the ECM. Chemically modified substrate surfaces have been used to study this interaction, but control over topology and functionality remains difficult. Modifying a cell's local chemical environment offers a viable method for selectively controlling the behaviour of cancer cells, and targeted presentation of external cues has the potential to enhance existing intracellular cancer therapy approaches. Combined with other targeted therapies (tyrosine kinase inhibitors, anti-growth factor antibodies), integrin inhibition may therefore be a promising avenue for drug development. However, it needs to be thoroughly evaluated in the pre-clinical phase, ideally taking into account all of the plausible escape mechanisms that tumour cells can develop.

By Navnidhi Sharma

Related articles: Why whales don't get cancer / Breast cancer and asbestos / MOFs in cancer drug delivery / Anti-cancer metal compounds

REFERENCES

Hamidi, H., Pietilä, M., & Ivaska, J. (2016). The complexity of integrins in cancer and new scopes for therapeutic targeting. British Journal of Cancer, 115(9), 1017–1023. https://doi.org/10.1038/bjc.2016.312

Jacob, M., Varghese, J., Murray, R. K., & Weil, P. A. (2016). Cancer: An Overview (V. W. Rodwell, D. A. Bender, K. M. Botham, P. J. Kennelly, & P. A. Weil, Eds.). Access Medicine; McGraw-Hill Education. https://accessmedicine.mhmedical.com/content.aspx?bookid=1366&sectionid=73247495

Li, M., Wang, Y., Li, M., Wu, X., Setrerrahmane, S., & Xu, H. (2021). Integrins as attractive targets for cancer therapeutics. Acta Pharmaceutica Sinica B. https://doi.org/10.1016/j.apsb.2021.01.004

Yoshii, T., Geng, Y., Peyton, S., Mercurio, A. M., & Rotello, V. M. (2016). Biochemical and biomechanical drivers of cancer cell metastasis, drug response and nanomedicine. Drug Discovery Today, 21(9), 1489–1494. https://doi.org/10.1016/j.drudis.2016.05.011

Zhao, H., Ross, F. P., & Teitelbaum, S. L. (2005). Unoccupied αvβ3 integrin regulates osteoclast apoptosis by transmitting a positive death signal. Molecular Endocrinology, 19(3), 771–780. https://doi.org/10.1210/me.2004-0161
- Beyond the bump: unravelling traumatic brain injuries | Scientia News
15/10/24, 11:34

The yearly incidence of TBI is between 27 and 69 million people worldwide

A traumatic brain injury (TBI) is one of the most serious and complex injuries sustained by the human body, often with profound and long-term effects on an individual's physical, emotional, behavioural and cognitive abilities.

What is a traumatic brain injury?

A TBI results from an external force which causes structural and physical damage to the brain. The primary injury refers to the immediate damage to the brain tissue caused directly by the event, whereas secondary injuries result from the cascade of cellular and molecular processes triggered by the initial injury and develop over hours to weeks following the initial TBI. Typically, the injury is either penetrating, where an object pierces the skull and damages the brain, or non-penetrating, which occurs when the external force is large enough to shake the brain within the skull, causing coup-contrecoup damage.

Diagnosis and severity

The severity of a TBI is classified as either mild (also known as concussion), moderate, or severe, using a variety of indices. While more than 75% of TBIs are mild, even these individuals can suffer long-term consequences from post-concussion syndrome. Two commonly used measures for initial classification are described here (a minimal illustration of these thresholds appears at the end of this article):

The Glasgow Coma Scale (GCS) is an initial neurological examination which assesses severity based on the patient's ability to open their eyes, move, and respond verbally. It is a strong indicator of whether an injury is mild (GCS 13-15), moderate (GCS 9-12) or severe (GCS ≤8).

Following the injury and any period of unconsciousness, when a patient has trouble with their memory and is confused, they are said to have post-traumatic amnesia (PTA). This is another measure of injury severity: PTA lasts up to 30 minutes in mild TBI, between 30 minutes and 24 hours in moderate TBI, and over 24 hours in severe TBI.

Imaging tests, including CT scans and MRIs, are used to detect brain bleeds, swelling or other damage. These tests are essential upon arrival at hospital, especially in moderate and severe cases, to understand the full extent of the injury.

Leading causes of TBI

Common causes of TBI include:
Falls (most common in young children and older adults)
Vehicle collisions (road traffic accidents, RTAs)
Inter-personal violence
Sports injuries
Explosive blasts

Interestingly, TBI is 1.5 times more common in men than in women.

General symptoms

The symptoms and outcome of a TBI depend on the severity and location of the injury. They differ from person to person based on a range of factors, including pre-injury sociodemographic vulnerabilities such as age, sex and level of education, as well as premorbid mental illnesses. Post-injury factors such as access to rehabilitation and psychosocial support also influence recovery. Because of this, nobody has the same experience of a TBI; however, some effects are more common than others, as described below.

Mild TBI:
Physical symptoms: headaches, dizziness, nausea, and blurred vision.
Cognitive symptoms: confusion, trouble concentrating, difficulty with memory or disorientation.
Emotional symptoms: mood swings, irritability, depression or anxiety.

Moderate-to-severe TBI:
Behavioural symptoms: aggression, personality change, disinhibition, impulsiveness.
Cognitive symptoms: difficulties with attention and concentration, decision making, memory, executive dysfunction, information processing, motivation, language, reasoning, self-awareness.
Physical symptoms: headaches, seizures, speech problems, fatigue, weakness or paralysis.

Many of these symptoms are 'hidden' and can often impact functional outcomes for an individual, such as their capacity for employment and daily living (i.e. washing, cooking, cleaning etc.). The long-term effects of TBI vary, with some individuals returning to normal functioning. Others, however, may experience lifelong disabilities and require adjustments to their daily lives. For more information and support, there are some great resources on the Headway website, a leading charity which supports individuals after brain injury.

Written by Alice Jayne Greenan
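As a purely illustrative aside, the GCS and PTA severity bands quoted above can be written as a simple lookup. This is a minimal sketch assuming only the thresholds given in this article; the function names are hypothetical and it is not a clinical tool.

```python
# Illustrative sketch only: encodes the GCS and PTA severity bands quoted
# in the article above. Function names are assumptions for illustration.

def severity_from_gcs(gcs: int) -> str:
    """Classify TBI severity from a Glasgow Coma Scale score (3-15)."""
    if not 3 <= gcs <= 15:
        raise ValueError("GCS scores range from 3 to 15")
    if gcs >= 13:
        return "mild"
    if gcs >= 9:
        return "moderate"
    return "severe"

def severity_from_pta(pta_hours: float) -> str:
    """Classify TBI severity from post-traumatic amnesia duration (hours)."""
    if pta_hours <= 0.5:        # up to 30 minutes
        return "mild"
    if pta_hours <= 24:         # 30 minutes to 24 hours
        return "moderate"
    return "severe"             # over 24 hours

if __name__ == "__main__":
    print(severity_from_gcs(14), severity_from_pta(0.25))   # mild mild
    print(severity_from_gcs(10), severity_from_pta(6))      # moderate moderate
    print(severity_from_gcs(6), severity_from_pta(48))      # severe severe
```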
- Monkey see, monkey clone | Scientia News
31/10/24, 11:37

A leap forward in primate research

Chinese scientists have recently unlocked the secrets of cloning Rhesus monkeys, offering new hope for medical breakthroughs.

Introduction

When we think of cloning, perhaps the first thing that comes to mind is Dolly the sheep, the first mammal ever cloned from an adult cell back in 1996. This groundbreaking achievement inspired a revolution, leading to the successful cloning of other mammals such as cattle and pigs. However, cloning primates, especially Rhesus monkeys, has proven to be a significant challenge due to low success rates and high embryonic losses during development.

What is cloning?

Cloning is the process of creating an identical genetic copy of an organism. In mammals, this is typically done through a technique called somatic cell nuclear transfer (SCNT). In SCNT, the nucleus (the compartment storing genetic material) from a cell of the animal to be cloned is transferred into an egg cell that has had its own nucleus removed. This hybrid egg cell then develops into an embryo, which is implanted into a surrogate mother to grow into a new individual. Despite the success in cloning other mammals, cloning primates has proven to be a significant challenge. However, the potential benefits of cloning primates for medical research make it a worthwhile endeavour.

The importance of cloning primates

You might be wondering why being able to clone primates is so important. Primates like the Rhesus monkey are invaluable models for studying human diseases and creating new therapies. The reason we can use them as disease models is that they share about 93% of their genetic identity with humans and have very similar physiological characteristics. For instance, Rhesus monkeys also experience a decline in their cognitive abilities as they age, and they lose important connections between brain cells in the part of the brain responsible for complex thinking, even when there is no severe brain damage. Moreover, Rhesus monkeys develop the same kinds of brain changes that we see in people with Alzheimer's disease, such as the buildup of sticky proteins called amyloid-beta and tangled fibres of another protein called tau. These similarities make them excellent models for understanding how human diseases progress and for developing new treatments. By cloning these animals, researchers might be able to create monkeys with specific genetic changes that mimic human diseases even more closely. This could allow scientists to study these diseases in greater detail and develop more effective therapies. Cloning primates could give us a powerful tool to fight against some of the most challenging disorders that affect the human brain!

A breakthrough in primate cloning

Now, a group of scientists in China have made a breakthrough in primate cloning. They successfully cloned a Rhesus monkey using a novel technique called trophoblast replacement (TR). This innovative approach not only helps us better understand the complex process of cloning but also offers a promising way to improve the efficiency of primate cloning, bringing us one step closer to unlocking the full potential of this technology for medical research and beyond.

The awry DNA methylation of cloned monkey embryos

To understand why cloning monkeys is so challenging, Liao and colleagues (2024) took a closer look at the genetic material of embryos created in two different ways.
They compared embryos made through a standard fertility treatment called intracytoplasmic sperm injection (ICSI) with those created via the cloning technique, SCNT. What they found was quite surprising: the SCNT embryos showed widespread abnormalities in DNA methylation, a chemical tag that helps switch genes on and off. To make matters worse, the scientists also noticed that certain genes, known as imprinted genes, were not functioning properly in the SCNT embryos. Imprinted genes are a special group of genes that play a crucial role in embryo development. In a healthy embryo, only one copy of an imprinted gene (either from the mother or the father) is active, while the other copy is silenced. But in the cloned embryos, both copies were often incorrectly switched on or off.

Here's the really concerning part: these genetic abnormalities were present not just in the early embryos but also in the placentas of the surrogate monkey mothers carrying the cloned offspring. This suggests that the issues arising from the cloning process start very early in development and continue to affect the pregnancy. Liao and colleagues suspect that the abnormal DNA methylation patterns might be responsible for the imprinted gene malfunction. It's like a game of genetic dominoes: when one piece falls out of place, it can cause a whole cascade of problems down the line. Piecing together this complex genetic puzzle is crucial for understanding why primate cloning is so difficult and how we can improve its success in the future. By shedding light on the mysterious world of DNA methylation and imprinted genes, Liao and colleagues have brought us one step closer to unravelling the secrets behind monkey cloning.

Digging deeper: what does the data reveal?

Liao et al. (2024) discovered that nearly half of the cloned monkey foetuses died before day 60 of the gestation period, indicating developmental defects in the SCNT embryos during implantation. They also found that the DNA methylation level in SCNT blastocysts was roughly 25% lower than in those created through ICSI (30.0% vs. 39.6%). Furthermore, of the 115 human imprinted genes they examined in both the embryos and placentas, four genes (THAP3, DNMT1, SIAH1, and RHOBTB3) showed abnormal expression and loss of DNA methylation in SCNT embryos. These findings highlight the complex nature of the reprogramming process in SCNT and the importance of imprinted genes in embryonic development. By understanding these intricacies, scientists can develop targeted strategies to improve the efficiency of primate cloning.

The power of trophoblast replacement

To avoid the anomalies in SCNT placentas, the researchers developed a new method called TR. In this method, they transferred the inner cell mass (the part of the early embryo that develops into the baby) from an SCNT embryo into the hollow cavity of a normal embryo created through fertilisation, after removing the normal embryo's own inner cell mass. The idea behind this technique is to replace the abnormal placental cells in the SCNT embryo with healthy ones from the normal embryo. And it worked! Using this method, along with some additional treatments, Liao et al. (2024) successfully cloned a healthy male Rhesus monkey that has survived for over two years (FYI, his name is Retro!).

The ethics of cloning

While the scientific advances in primate cloning are exciting, they also raise important ethical questions. Some people worry about the potential misuse of this technology, for instance to clone humans, which is widely considered unethical.
Others are concerned about the well-being of cloned animals, as the cloning process can sometimes lead to health problems. As scientists continue to make progress in cloning technology, it is essential to have open discussions about the ethical implications of their work. Rules and guidelines must be put in place to ensure that this technology is developed and used responsibly, with the utmost care for animal welfare and the concerns of society.

Looking to the future

The successful cloning of a Rhesus monkey using TR opens up new avenues for primate research. This technology could help scientists create genetically identical monkeys to study a wide range of human diseases, from neurodegenerative disorders like Alzheimer's and Parkinson's to infectious diseases like HIV and COVID-19. The trophoblast replacement technique developed by Liao et al. (2024) increases the likelihood of successful cloning by replacing the abnormal placental cells in the SCNT embryo with healthy ones from a normal embryo. However, it is important to note that this technique does not affect the genetic similarity between the clone and the original monkey, as the inner cell mass, which gives rise to the foetus, is still derived from the SCNT embryo. Moreover, this research provides valuable insights into the mechanisms of embryonic development and the role of imprinted genes in this process. By understanding these fundamental biological processes, scientists can not only improve the efficiency of cloning but also develop new strategies for regenerative medicine and tissue engineering.

As we look to the future, cloning monkeys could help us make groundbreaking discoveries in medical research and develop new treatments for human diseases. However, we must also carefully consider the ethical implications of cloning primates and ensure that this powerful tool is used responsibly and for the benefit of society.

Written by Irha Khalid

REFERENCES

Beckman, D. and Morrison, J.H. (2021). Towards developing a rhesus monkey model of early Alzheimer's disease focusing on women's health. American Journal of Primatology, 83(11). https://doi.org/10.1002/ajp.23289

Liao, Z., Zhang, J., Sun, S., Li, Y., Xu, Y., Li, C., Cao, J., Nie, Y., Niu, Z., Liu, J., Lu, F., Liu, Z. and Sun, Q. (2024). Reprogramming mechanism dissection and trophoblast replacement application in monkey somatic cell nuclear transfer. Nature Communications, 15(1), p.5. https://doi.org/10.1038/s41467-023-43985-7

Morrison, J.H. and Baxter, M.G. (2012). The ageing cortical synapse: hallmarks and implications for cognitive decline. Nature Reviews Neuroscience, 13(4), pp.240–250. https://doi.org/10.1038/nrn3200

Paspalas, C.D., Carlyle, B.C., Leslie, S., Preuss, T.M., Crimins, J.L., Huttner, A.J., Dyck, C.H., Rosene, D.L., Nairn, A.C. and Arnsten, A.F.T. (2017). The aged rhesus macaque manifests Braak stage III/IV Alzheimer's-like pathology. Alzheimer's & Dementia, 14(5), pp.680–691. https://doi.org/10.1016/j.jalz.2017.11.005

Shi, L., Luo, X., Jiang, J., Chen, Y., Liu, C., Hu, T., Li, M., Lin, Q., Li, Y., Huang, J., Wang, H., Niu, Y., Shi, Y., Styner, M., Wang, J., Lu, Y., Sun, X., Yu, H., Ji, W. and Su, B. (2019). Transgenic rhesus monkeys carrying the human MCPH1 gene copies show human-like neoteny of brain development. National Science Review, 6(3), pp.480–493. https://doi.org/10.1093/nsr/nwz043
- Antisense oligonucleotide gene therapy for treating Huntington's disease | Scientia News
20/03/24, 18:10

A potential gene therapy

Huntington's disease (HD) is an inherited neurodegenerative disease caused by a CAG repeat expansion in exon 1 of the huntingtin gene. The expanded alleles produce an extended polyglutamine tract in the huntingtin protein, resulting in intracellular signalling defects. Antisense oligonucleotide (ASO) gene therapy is currently being pioneered to treat HD. In this therapy, oligonucleotides are introduced into cells, where they bind to the target huntingtin mRNA and inhibit the formation of the huntingtin protein, either by physically blocking translation of the mRNA (Figure 1) or by recruiting RNase H to degrade the mRNA.

Previous ASO gene therapy experiments conducted on R6/2 mice that express the human huntingtin gene have been successful. In HD research, the R6/2 mouse model is commonly used to replicate HD symptoms and is therefore useful for testing potential treatments. The transgenic R6/2 mouse carries an N-terminal fragment of the mutant human huntingtin gene with a CAG repeat expansion within exon 1. In this successful experiment, scientists treated one group of R6/2 mice with an ASO that suppresses the production of human huntingtin mRNA, while saline solution was administered to the control group. The experiment aimed to confirm whether ASO therapy improves the survival rate of R6/2 mice. The results showed that human huntingtin mRNA levels in the mice treated with ASO therapy were lower than in the control group. Furthermore, the ASO-treated mice had a higher percentage of survival and lived longer (21 weeks) than the control group mice, which survived until 19 weeks. It could thus be concluded that with less human huntingtin mRNA present in the ASO group, less of it would be translated, and so less huntingtin protein would be synthesised than in the control group.

The results of this study are enormously informative for understanding how gene therapy could be used in the future to treat other neurological diseases. However, before ASO therapy is approved for clinical use, further trials will need to be conducted in humans to verify the same successful outcomes seen in the R6/2 mice. If approved, the symptoms of HD, including dystonia, could be safely controlled with ASO therapy. Scientists also need to consider that an increase in survival of only an additional two weeks, as shown in the experiment, does not always correlate with an increased quality of life for the patient. It therefore needs to be established whether the benefits of ASO gene therapy outweigh the risks associated with it.

Furthermore, the drug PBT2, which influences copper interactions with abnormal proteins, is currently being studied as a potential treatment option for HD. Some studies have inferred that the aggregation of mutant huntingtin proteins could be due to interactions with metals, including copper. This drug is therefore designed to chelate metals and consequently decrease abnormal protein aggregation in the body. The treatment has been shown to improve motor tasks and increase lifespan in R6/2 mice. However, as this treatment still has many shortcomings, further studies need to be conducted over a longer period of time to confirm a successful outcome of this drug in HD patients.
Written by Maria Z Kahloon

References:

Kordasiewicz HB, Stanek LM, Wancewicz EV, Mazur C, McAlonis MM, Pytel KA, et al. Sustained therapeutic reversal of Huntington's disease by transient repression of huntingtin synthesis. Neuron. 2012;74(6):1031–44.

Valcárcel-Ocete L, Alkorta-Aranburu G, Iriondo M, Fullaondo A, García-Barcina M, Fernández-García JM, et al. Exploring genetic factors involved in Huntington disease age of onset: E2F2 as a new potential modifier gene. PLoS One. 2015;10(7):e0131573.

Liou S. Antisense gene therapy [Internet]. Stanford.edu. 2010 [cited 2021 Aug 6]. Available from: https://hopes.stanford.edu/antisense-gene-therapy/

Huntington's disease research study in R6/2 mouse model: Charles River [Internet]. Charles River Labs. [cited 2021 Aug 26]. Available from: https://www.criver.com/products-services/discovery-services/pharmacology-studies/neuroscience-models-assays/huntingtons-disease-studies/r62-mouse?region=3696

Frank S. Treatment of Huntington's disease. Neurotherapeutics: the journal of the American Society for Experimental NeuroTherapeutics. Springer US; 2014;11(1):153–160.

Potkin KT, Potkin SG. New directions in therapeutics for Huntington disease. Future Neurology. 2018;13(2):101–121.
- Anaemia | Scientia News
03/06/24, 14:57

A disease of the blood

This is article no. 1 in a series about anaemia. Next article: iron-deficiency anaemia

Introduction

In their typical state, erythrocytes are biconcave, nucleus-free cells responsible for carrying oxygen and carbon dioxide. Their production is controlled by erythropoietin, and as they mature in the bone marrow they lose their nuclei. These red blood cells (RBCs) contain haemoglobin, which aids in the transport of oxygen. Iron is a key component of haem, and insufficient iron levels lead to anaemic disorders. Low oxygen-carrying capacity may result from too few RBCs in circulation or from RBC dysfunction.

Haem iron is acquired through the digestion of meat and is transported, in its soluble form, through the enterocytes of the duodenum. Erythrocytic iron accounts for approximately 50% of the iron in blood. Metals cannot move freely throughout the body, so they must be carried; the molecule that transports iron is known as transferrin. Plasma transferrin saturation refers to the proportion of iron attached to transferrin, and in iron-deficiency anaemia (IDA) this will always be low.

Anaemia may be physiological or pathological, and these changes can have a plethora of causes: malabsorption due to diet or gastrointestinal (GI) conditions; genetic predispositions such as sideroblastic anaemias (SA) and thalassaemia; deficiency in erythropoietin due to comorbidities and chronic disease; haemolysis caused by autoimmune disorders, infections and drugs; or blood loss.

Haem

Iron sits within a protoporphyrin ring at the centre of each haem molecule. Haemoglobin itself consists of two alpha and two beta globin chains, each bound to a haem group, which together form a single haemoglobin macromolecule. Microcytic anaemias arise from problems in the creation of haemoglobin: sourcing iron through the diet (IDA), synthesising protoporphyrin (SA), or globin chain defects caused by thalassaemia.

Summary

Anaemia is a multifactorial condition with many different mechanisms involved. Microcytic anaemias have an issue at the haemoglobin level, and these can be acquired or inherited. A microcytic anaemia is caused by a failure to efficiently synthesise haemoglobin, whether due to iron, the protoporphyrin ring or the globin chains. The diagnosis of anaemias relies on a patient's background and medical history, as there are many factors involved in an anaemic disorder. Diagnosis should be patient led, as the age and sex of the patient can significantly point to the origin and pathogenesis, as well as the prognosis and follow-up care.

By Lauren Kelly
- From botulism to beauty: the evolution of botulinum toxins and botox | Scientia News
24/09/24, 11:03

How Botox works in the cosmetic industry

Botulinum neurotoxins (BoNTs) rank amongst the most potent and lethal neurotoxins known to science. Yet it is a fascinating journey to discover how these deadly substances have found their way into one of the most renowned cosmetic procedures in the world: Botox.

BoNTs originate from the bacterium Clostridium botulinum, which produces some of the most potent neurotoxins in existence. They are central to the development of botulism, a condition that relentlessly targets the body's nervous system, resulting in muscle paralysis and difficulty breathing. Despite their perilous origins, these toxins have undergone a fascinating metamorphosis into a popular cosmetic procedure. They have been studied extensively because of their ability to block nerve function, leading to muscle paralysis, and because of their unique pharmacological properties in therapeutic and cosmetic uses.

BoNTs affect neurotransmission by blocking the release of acetylcholine, the neurotransmitter that triggers muscle contraction. The toxins bind pre-synaptically to recognition sites on cholinergic nerve terminals, inhibiting neurotransmitter release. Each toxin consists of a heavy chain and a light chain connected by a disulphide bond. This disulphide bond is vital for entry of the metalloprotease light chain into the cytosol. BoNTs have a unique binding characteristic as dual receptor binders, which allows them to achieve a high affinity for neurons. These proteins possess the remarkable ability to specifically target and interfere with the neurotransmission process. At their core, BoNTs are proteases: enzymes specialised in cleaving specific proteins involved in nerve signal transmission.

When administered as Botox, these properties are skilfully harnessed. By injecting small, controlled amounts into specific facial muscles, clinicians temporarily disrupt the nerve signals that stimulate muscle contraction. This leads to muscle relaxation, smoothing out wrinkles and lines on the skin's surface. Importantly, the effects are localised, preserving the natural expressiveness of the face.

In 1989, BoNTs made their debut in the medical community when the FDA recognised them as a safe and effective treatment for blepharospasm, a condition affecting eye muscle control. In 2002, the FDA extended its endorsement to cosmetic use, propelling Botox into the realm of beauty. This pivotal decision reshaped the landscape of cosmetic procedures, solidifying Botox's status as an iconic treatment for rejuvenation and enhancement.

In conclusion, the evolution of botulinum toxins and the rise of Botox is a captivating journey that traverses the realms of science, medicine, and evolving beauty ideals.

By Anam Ahmed
- Delving into the world of chimeras | Scientia News
17/03/24, 17:56

An exploration of this genetic concept

The term chimera has been borrowed from Greek mythology, transcending ancient tales to become a captivating concept within the fields of biology and genetics. In mythology, the chimera was a monstrous hybrid creature. In the biological context, however, a chimera refers to an organism with cells derived from two or more zygotes. While instances of natural chimerism exist within humans, researchers are pushing the boundaries of genetics through the intentional creation of chimeras, consequently sparking debates and breakthroughs in fields spanning from medicine to agriculture.

Despite the expectation that every cell in the body should share an identical genome, chimeras challenge this notion. For example, the fusion of non-identical twin embryos in the womb is one way chimeras can emerge. Visible cues, such as heterochromia or patches of varied skin tone, may provide subtle hints of chimerism, but affected individuals often show no overt signs, making its prevalence uncertain. In cases where male and female cells coexist, abnormalities in the reproductive organs may occur.

Furthermore, advancements in genetic engineering and CRISPR genome editing have allowed the artificial creation of chimeras, which may aid medical research and treatments. In 2021, the first human-monkey chimera embryo was created in China to investigate ways of using animals to grow human organs for transplants. The organs could be genetically matched by taking the recipient's cells and reprogramming them into stem cells. However, the process of creating a chimera can be challenging and inefficient. This was shown when researchers from the Salk Institute in California tried to grow the first embryos containing cells from humans and pigs: of 2,075 implanted embryos, only 186 developed up to the 28-day time limit for the project.

Chimeras are not exclusive to the animal kingdom; plants exhibit this genetic complexity as well. The first non-fictional chimera, the "Bizzaria" discovered by a Florentine gardener in the seventeenth century, arose from the graft junction between sour orange and citron. Initially thought to be an asexual hybrid formed from cellular fusion, later analyses revealed it to be a chimera, a mix of cells from both donors. This pivotal discovery in the early twentieth century marked a turning point, shaping our understanding of chimeras as unique biological phenomena. Chimerism is a common cause of variegation, in which parts of a leaf appear green and other parts white. This is because the white or yellow portions of the leaf lack the green pigment chlorophyll, a difference that can be traced to layers in the meristem (areas of active cell division found at the root and shoot tips) that are either genetically capable or incapable of making chlorophyll.

As we conclude this exploration into the world of chimeras, from the mythological realm to the scientific frontier, it is evident that these entities continue to mystify and inspire, broadening our understanding of genetics, development, and the interconnectedness of organisms. Whether natural wonders or products of intentional creation, chimeras beckon further exploration, promising a deeper comprehension of the fundamental principles that govern the tapestry of life.

Written by Maya El Toukhy

Related article: Micro-chimerism and George Floyd's death
- Brief neuroanatomy of autism | Scientia News
10/10/24, 10:28

Differences in brain structure

Autism is a neurodevelopmental condition present in both children and adults worldwide. The core symptoms include difficulties understanding social interaction and communication, and restrictive or repetitive behaviours such as strict routines and stimming. When the term autism was first coined in the 20th century, it was thought of as a disease. However, it is now described as a cognitive difference rather than a disease; that is, the brains of autistic individuals, along with those of people diagnosed with dyslexia, dyspraxia, or attention deficit hyperactivity disorder, are not defective but simply wired differently.

The exact cause or mechanism of autism has not been determined; the symptoms are thought to be brought about by a combination of genetic and environmental factors. Currently, autism disorders are diagnosed solely by observing behaviours, without measuring the brain directly. However, behaviours may be seen as the observable consequence of brain activity. So, what is it about their brains that might make autistic individuals behave differently to neurotypicals?

Total brain volume

Before sophisticated imaging techniques were in use, psychiatrists had already observed that the head size of autistic infants was often larger than that of other children. Later studies provided more evidence that most children who would go on to be diagnosed had a normal-sized head at birth, but an abnormally large circumference by the time they had turned 2 to 4 years old. Interestingly, the increase in head size has been found to correlate with the onset of the main symptoms of autism. After childhood, however, growth appears to slow down, and autistic teenagers and adults present brain sizes comparable to those of neurotypicals.

The amygdala

As well as the transient increase in total brain volume, the size and volume of several particular brain structures seem to differ between individuals with and without autism. Most studies have found that the amygdala, a small area in the centre of the brain that mediates emotions such as fear, appears enlarged in autistic children. The amygdala is a particularly interesting structure to study in autism, as individuals often have difficulty interpreting and regulating emotions and social interactions. Its increased size seems to persist at least until early adolescence. However, studies in adolescents and adults tend to show that the enlargement slows down, and in some cases is even reversed, so that the number of amygdala neurons may be lower than normal in autistic adults.

The cerebellum

Another brain structure that tends to present abnormalities in autism is the cerebellum. Sitting at the back of the head near the spinal cord, it is known to mediate fine motor control and proprioception. Yet recent literature suggests it may also play an important role in some other, higher cognitive functions, including language and social cognition. Specifically, it may be involved in our ability to imagine hypothetical scenarios and to abstract information from social interactions. In other words, it may help us recognise similarities and patterns in past social interactions that we can apply to understand a current situation. This ability is poor in autism; indeed, some investigations have found the volume of the cerebellum may be smaller in autistic individuals, although research is not conclusive.
Nevertheless, most research agrees that the number of Purkinje cells is markedly lower in people with autism. Purkinje cells are a type of neuron found exclusively in the cerebellum, able to integrate large amounts of input information into a coherent signal. They are also the only source of output from the cerebellum: they connect the structure with other parts of the brain, such as the cortex and subcortical structures. These connections ultimately bring about specific functions, including motor control and cognition. A low number of Purkinje cells may therefore cause underconnectivity between the cerebellum and other areas, which might be the reason functions such as social cognition are impaired in autism.

Written by Julia Ruiz Rua

Related article: Epilepsy
- The role of chemistry in space exploration | Scientia News
21/11/24, 12:09

How chemistry plays a part

Background

Space exploration is without a doubt one of the most intriguing areas of science. As humans, we have a natural tendency to investigate everything around us; with space, the main question we want to answer is whether there is life beyond Earth. Astronomers use advanced telescopes to find celestial objects and study their structures, bringing us closer to answering this question. However, astronomers have to communicate with other scientists in doing so; after all, science is all about collaboration. One example is theoretical physicists studying observational data and, as the name suggests, coming up with theories using computational methods for other scientists to examine experimentally. In this article, we will acknowledge the importance of chemistry in space exploration, from studying celestial bodies to life support technology for astronauts and more.

Examples of chemistry applications

1) Portable life support systems

Surviving in space requires advanced and well-designed life support systems, because astronauts are exposed to extreme temperatures and conditions. Portable life support systems (PLSS) are devices connected to an astronaut's spacesuit that supply oxygen and remove carbon dioxide (CO2). The famous Apollo lunar landing missions used clever PLSS designs: lithium hydroxide canisters removed CO2, and liquid cooling garments used circulating water to carry heat away from the astronaut. However, these systems are large and quite bulky, so hopefully chemistry will help us design even smarter PLSS in the future.

2) Solid rocket propulsion systems

Chemical propellants in rockets eject reaction mass at high velocity and pressure using a source of fuel and an oxidiser, producing thrust in the engine. Simply put, thrust is a strong force that causes an object to move; in this case, a rocket launching into space. Advancements in propellant chemistry have allowed greater space exploration to take place, thanks to more efficient and reliable systems.

3) Absorption spectroscopy

Electromagnetic radiation is energy travelling at the speed of light (approx. 3.0 × 10⁸ m/s!) that can interact with matter. This radiation spans different wavelengths and frequencies, with longer wavelengths corresponding to lower frequencies and vice versa. Each molecule has unique absorption wavelengths; this means that when radiation of a specific wavelength hits a substance, electrons in the ground state become excited and can jump up to higher energy states. A line appears in the absorption spectrum for each such transition (see Figure 1). As a result, spectroscopic analysis of newly discovered planets or moons can tell us which elements are present. It should also be noted that the excited electrons will relax back down to the ground state and emit a photon, allowing us to observe emission spectra as well. In an emission spectrum, the lines appear in exactly the same places as in the absorption spectrum, but as coloured lines on a black background (see Figure 2).

Fun fact: there are six essential elements needed for life: carbon, hydrogen, nitrogen, oxygen, phosphorus and sulfur. In 2023, scientists concluded that Saturn's moon Enceladus has all of these, which indicates that life could be present there!
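To make the wavelength-frequency relationship above concrete, here is a minimal Python sketch (not from the article) that converts a wavelength into its frequency and photon energy using c = λν and E = hν. The 656.3 nm example line and the function names are illustrative assumptions.

```python
# Illustrative sketch: relating wavelength, frequency and photon energy for
# absorption/emission lines (c = lambda * nu, E = h * nu).

PLANCK_H = 6.626e-34      # Planck constant, J s
SPEED_OF_LIGHT = 2.998e8  # speed of light, m/s

def frequency_from_wavelength(wavelength_m: float) -> float:
    """Frequency (Hz) of light with the given wavelength (m)."""
    return SPEED_OF_LIGHT / wavelength_m

def photon_energy(wavelength_m: float) -> float:
    """Energy (J) of a single photon with the given wavelength (m)."""
    return PLANCK_H * frequency_from_wavelength(wavelength_m)

if __name__ == "__main__":
    wavelength = 656.3e-9  # m; a visible hydrogen line, chosen as an example
    print(f"frequency: {frequency_from_wavelength(wavelength):.3e} Hz")
    print(f"photon energy: {photon_energy(wavelength):.3e} J")
```

Running the sketch shows the inverse relationship described above: halving the wavelength doubles both the frequency and the photon energy.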
4) Space medicine

While many people are fascinated by the idea of going to space, it is definitely not an easy task, as the body undergoes more stress and changes than one can imagine. For example, barotrauma occurs when air-filled spaces in the body are injured by differences in pressure between the body and the ambient atmosphere. Another example is weakening of the immune system: researchers have found that pre-existing T cells in the body become less able to fight off infection. However, the field of space medicine is growing and works to prevent discomforts like those above where possible. Space medicine researchers have developed 'countermeasures' for astronauts to follow, such as special exercises and diets that maintain bone and muscle mass. Being in space is isolating, which can cause mental health problems, so counselling and therapy are also provided early on to help prevent this.

To conclude

Overall, chemistry plays a vital role in the field of space exploration. It allows us to go beyond just the analysis of celestial objects, as demonstrated in this article. Typically, when we hear the word 'chemistry' we think only of its applications in medicine or the environment, but its versatility should be celebrated more often.

Written by Harsimran Kaur

Related articles: AI in space / The role of chemistry in medicine / Astronauts in space
- A concise introduction to Markov chain models | Scientia News
03/06/24, 14:53

How do they work?

Introduction

A Markov chain is a stochastic process that models a system transitioning from one state to another, where the probability of the next state depends only on the current state and not on the previous history. More formally, for a sequence of states X0, X1, X2, …:

P(Xn+1 = x | Xn, Xn−1, …, X0) = P(Xn+1 = x | Xn)

that is, once the current state Xn is known, the earlier history adds no further information about the next state.

It may be hard to think of real-life processes that follow this behaviour, because we tend to assume that every event depends on the whole sequence of events that came before it. Here are some examples:

Games, e.g. chess: if your king is on a certain square of the board, there is a limited set of transition states it can reach (at most the eight neighbouring squares), all of which depend only on the piece's current position. The parameters of the Markov model will obviously vary depending on your position on the board, which is the essence of the Markov process.

Genetics: the genetic code of an organism can be modelled as a Markov chain, where each nucleotide (A, C, G, or T) is a state, and the probability of the next nucleotide depends only on the current one.

Text generation: consider the current state to be the most recent word. The transition states are all possible words which could follow that word. Next-word prediction algorithms can utilise a first-order Markov process to predict the next word in a sentence based on the most recent word.

The text generation example is particularly interesting because considering only the previous word when predicting the next one leads to very random sentences. That is where we can change things up using various mathematical techniques.

k-order Markov chains (adding more steps)

In a first-order Markov chain, we only consider the immediately preceding state to predict the next state. In k-order Markov chains, however, we broaden our perspective. Here's how it works:

Definition: a k-order Markov chain considers the previous k states (or steps) when predicting the next state. It's like looking further back in time to inform our predictions.

Example: suppose we're modelling the weather. In a first-order Markov chain, we'd only look at today's weather to predict tomorrow's weather. But in a second-order Markov chain, we'd consider both today's and yesterday's weather. Similarly, a third-order Markov chain would involve three days of historical data.

By incorporating more context, k-order chains can capture longer-term dependencies and patterns. As k increases, the model becomes more complex, and we need more data to estimate transition probabilities accurately. See the diagram below for a definition of higher-order Markov chains.

Markov chains for Natural Language Processing

A Markov chain can generate text by using a dictionary of words as the states, and the frequency with which words follow one another in a corpus of text as the transition probabilities. Given an input word, such as "How", the Markov chain can generate the next word, such as "to", by sampling from the probability distribution of words that follow "How" in the corpus. Then the Markov chain can generate the next word, such as "use", by sampling from the probability distribution of words that follow "to" in the corpus. This process can be repeated until a desired length or the end of a sentence is reached.
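Here is a minimal sketch of that procedure in Python, not taken from the article: a first-order Markov chain text generator whose transition probabilities are estimated from word-pair counts in a tiny example corpus. The corpus, function names and output length are illustrative assumptions.

```python
# A minimal first-order Markov chain text generator (illustrative sketch).
import random
from collections import defaultdict

def build_chain(corpus: str) -> dict:
    """Map each word to the list of words observed to follow it in the corpus."""
    words = corpus.split()
    chain = defaultdict(list)
    for current_word, next_word in zip(words, words[1:]):
        chain[current_word].append(next_word)
    return chain

def generate(chain: dict, start: str, length: int = 10) -> str:
    """Generate text by repeatedly sampling a follower of the current word."""
    word, output = start, [start]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:                    # dead end: no observed follower
            break
        word = random.choice(followers)      # frequent followers are sampled more often
        output.append(word)
    return " ".join(output)

corpus = ("how to use a markov chain to generate text "
          "how to use a dictionary of words as states")
chain = build_chain(corpus)
print(generate(chain, "how"))
```

Because each follower is stored once per occurrence, sampling uniformly from the list is equivalent to sampling according to the observed transition frequencies; a k-order version of the same idea would simply key the dictionary on tuples of the previous k words.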
That is a basic example; for more complex NLP tasks we can employ richer Markov models such as k-order, variable-order, n-gram or even hidden Markov models.

Limitations of Markov models

Markov models will struggle with tasks such as text generation because they are too simplistic to create text that is intelligent, and sometimes even coherent. Here are some reasons why:

Fixed Transition Probabilities: Markov models assume that transition probabilities are constant throughout. In reality, language is dynamic, and context can change rapidly; fixed probabilities may not capture these nuances effectively.

Local Dependencies: Markov chains have local dependencies, meaning they only consider a limited context (e.g. the previous word). They don't capture long-range dependencies or global context.

Limited Context Window: Markov models have a fixed context window (e.g. first-order, second-order, etc.). If the context extends beyond this window, the model won't capture it.

Sparse Data: Markov models rely on observed data (transition frequencies) from the training corpus. If certain word combinations are rare or absent, the model struggles to estimate accurate probabilities.

Lack of Learning: Markov models don't learn from gradients or backpropagation. They're based solely on observed statistics.

Written by Temi Abbass

FURTHER READING

1. "Improving the Markov Chain Approach for Generating Text Used for…": this work focuses on text generation using Markov chains. It highlights the chance-based transition process and the representation of temporal patterns determined by probability over sample observations.

2. "Synthetic Text Generation for Sentiment Analysis": this paper discusses text generation using latent Dirichlet allocation (LDA) and a text generator based on Markov chain models. It explores approaches for generating synthetic text for sentiment analysis.

3. "A Systematic Review of Hidden Markov Models and Their Applications": this review provides insights into HMMs, a statistical model designed using a Markov process with hidden states. It discusses their applications in various fields, including robotics, finance, social science, and ecological time-series data analysis.