- Will diabetes mellitus become the next epidemic? | Scientia News
Will diabetes mellitus become an epidemic?
Last updated: 07/11/24

Defining diabetes and its causes

Looking at modern society in terms of the food consumed and the amount of exercise undertaken collectively, it seems inevitable that diabetes will become an epidemic. Before delving into that statement, some definitions: diabetes mellitus (from the Greek 'siphon' and Latin 'sweet') is a non-communicable disease in which the pancreas cannot produce enough insulin to keep blood sugar levels under control, leaving them persistently high. Diabetes can be categorised into various types, the most common being types 1 and 2 as well as gestational diabetes (which occurs during pregnancy). There is also diabetes insipidus (from the Latin 'lacking taste'), in which the kidneys are unable to conserve water.

The causes of diabetes mellitus depend on the type. Type 1 can be caused by the body's immune system attacking the pancreas, damaging the beta cells so that they cannot make enough insulin; environmental factors such as diet and viral infections may also contribute to the disease. Type 2, by contrast, primarily arises from insulin resistance, meaning that the body does not respond to the hormone as effectively as it would in a person without diabetes. This in turn impairs insulin-mediated glycogen synthesis and glycolysis, leading to hyperglycaemia, as seen in figure 1.

There are many reasons why diabetes is likely to become an epidemic. Firstly, there is a clear connection between obesity and type 2 diabetes which cannot be ignored: one article found that obesity exacerbates type 2 diabetes, perhaps due to increased NEFAs (non-esterified fatty acids) and glycerol, among other linked biochemicals.
On the other hand, the same article noted that people with type 1 diabetes are not usually obese. Nevertheless, to reduce the incidence of type 2 diabetes in later life, it is important to implement strategies such as regular exercise and lowering carbohydrate intake.

Building on the previous point, one of the major contributors to the rise in obesity and type 2 diabetes diagnoses is the sedentary lifestyle, that is, decreased mobility through prolonged sitting. A meta-analysis of 10 studies with over 500,000 volunteers concluded that there was a 112% cumulative increase in type 2 diabetes risk linked to time spent watching TV. Additionally, a study showed that more sedentary time was associated with a higher body and trunk fat percentage and reduced appendicular skeletal muscle mass. Taking these findings into account, it is evident that exercise plays a role in reducing the risk of type 2 diabetes.

Counteracting the previous paragraphs, it is equally plausible that diabetes will not become an epidemic, because current pharmaceutical treatments exist for type 2 diabetes: oral drugs such as sulfonylureas and meglitinides, which cause the pancreas to release insulin, and injection-based ones such as amylin mimetics, which help maintain blood glucose concentration. Those with type 1 diabetes mainly take insulin, because they are deficient in the hormone, or they can have a pancreatic transplant, which has survival rates of more than 96% and 83% at 1 and 5 years after the operation respectively, although, like any transplant, it carries a major risk of rejection.
With regards to future treatments, a review discussed how newer glucose-lowering drugs such as dipeptidyl peptidase-4 (DPP-4) inhibitors have been assessed in cardiovascular outcome trials, in which patients experienced fewer events such as myocardial infarction and less albuminuria, indicating that these drugs may be useful for the heart and kidney diseases associated with type 2 diabetes. Furthermore, there are other potential therapies, such as probiotics and prebiotics, which can be used along with faecal transplants to change the gut microbiome of type 2 diabetes patients.

It is uncertain whether diabetes will become an epidemic

From a more neutral perspective, there is not enough certainty that diabetes will or will not become an epidemic, simply because the future cannot be predicted with complete accuracy. Future interventions for treating diabetes may never materialise, perhaps due to factors such as politics and societal attitudes towards science, as well as the difficulty of bringing a new therapy to market for patients to consider. Another point to address is that the human body is so complex that it took thousands of years to establish, and partially understand, the facts currently known about its anatomy and physiology. Moreover, some observations about the human body remain unclear to scientists today, so drugs for treating diabetes may or may not be effective depending on who receives the therapy, because each person is genetically unique.

Conclusion

Referring to all of the arguments made, it is evident that diabetes is a huge burden for modern and future societies because of its links to obesity, sedentary lifestyles and diets high in carbohydrates.
Yet this issue may be prevented by exploring future therapies, exploiting current ones and implementing non-clinical interventions such as regular exercise and reduced carbohydrate intake. Therefore, it is the responsibility of each patient and health organisation to manage diabetes before it becomes even worse.

Written by Sam Jarada
Related articles: Pre-diabetes / Diabetes drug to treat Parkinson's / The world vs the next pandemic

REFERENCES
Diabetes UK. Types of diabetes. Diabetes UK. 2022.
Paschou SA, Papadopoulou-Marketou N, Chrousos GP, Kanaka-Gantenbein C. On type 1 diabetes mellitus pathogenesis. Endocrine Connections. 2018 Jan;7(1):R38–46.
Cersosimo E, Triplitt C, Solis-Herrera C, Mandarino LJ, DeFronzo RA. Pathogenesis of Type 2 Diabetes Mellitus. MDText.com, Inc.; 2018.
Algoblan A, Alalfi M, Khan M. Mechanism linking diabetes mellitus and obesity. Diabetes, Metabolic Syndrome and Obesity: Targets and Therapy. 2014 Dec;7:587–591.
Barnes AS. The epidemic of obesity and diabetes: trends and treatments. Texas Heart Institute Journal. 2011;38(2):142–4.
Hamilton MT, Hamilton DG, Zderic TW. Sedentary Behavior as a Mediator of Type 2 Diabetes. Medicine and Sport Science. 2014;60:11–26.
Li D, Yang Y, Gao Z, Zhao L, Yang X, Xu F, et al. Sedentary lifestyle and body composition in type 2 diabetes. Diabetology & Metabolic Syndrome. 2022 Jan 15;14(1).
Mayo Clinic. Diabetes treatment: Medications for type 2 diabetes. Mayo Clinic. 2018.
Bahar SG, Devulapally P. Pancreas Transplantation. Treasure Island (FL): StatPearls Publishing; 2022.
Bailey CJ, Day C. The future of new drugs for diabetes management. Diabetes Research and Clinical Practice. 2019 Sep;155:107785.
Bailey CJ, Day C. Treatment of type 2 diabetes: future approaches. British Medical Bulletin. 2018 Jun 1;126(1):123–37.
- The story of pigments and dyes | Scientia News
The story of pigments and dyes
Last updated: 24/09/24

A chemist's palette

Pigments and dyes are vital for producing vibrancy and changing the colours of our surroundings. Their wide use in cosmetics, pharmaceuticals, inks and textiles means they play a crucial role in creating the colourful world we see around us. But how did they come into existence?

It all started with the extraction of colours from the world around us, such as green chlorophyll from leaves and reds from berries, which were used to decorate caves and clothes in early civilisations. However, when synthetic dyes arrived in the 19th century, things advanced rapidly. Mauveine was discovered accidentally by William Henry Perkin; its vivid purple colour proved that we could make complicated organic substances from simpler ones, challenging the idea that organic compounds could only come from living things or nature.

How does chemistry relate to the colours produced? The way molecules are built fundamentally decides what colours are visible. In summary, the colours we see result from electrons in atoms and molecules absorbing and then releasing energy in the form of light. The specific colours are determined by the amount of energy released and the unique arrangement of electrons in each substance.

In chemistry, pigments and dyes are used in various applications, such as indicators in chemical reactions, chromatography, photovoltaic cells and, most commonly, titration. They enable researchers to explore chemical processes and analyse substances. However, there are many environmental concerns regarding synthetic dyes, including pollution and water contamination. Synthetic dyes may also contain chemicals and additives that are toxic to aquatic life, posing risks to the environment.
To address these issues, regulation, research into eco-friendly alternatives, sustainable practices and public education are all important. In essence, we are constantly reminded of the evolving relationship between colours and chemistry. In the future, as more colour-changing materials and new uses are discovered, chemists will continue to be fascinated by the endless possibilities.

Written by Anam Ahmed
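As a closing aside, the link between energy and colour described above follows directly from the Planck relation E = hc/λ. The sketch below is a minimal illustration, not taken from the article itself; it uses the standard constant hc ≈ 1239.84 eV·nm, and the 2.1 eV transition is a hypothetical example value:

```python
PLANCK_EV_NM = 1239.84  # h*c expressed in eV·nm

def wavelength_nm(energy_ev: float) -> float:
    """Wavelength (in nm) of the photon released when an electron
    drops across an energy gap of the given size (in eV)."""
    return PLANCK_EV_NM / energy_ev

# A ~2.1 eV electronic transition emits light near 590 nm (orange);
# smaller gaps shift the light toward red, larger gaps toward blue.
print(round(wavelength_nm(2.1)))
```

This is why the arrangement of electrons in a pigment molecule, which sets the size of its energy gaps, determines the colour we perceive.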
- Exploring Ibuprofen | Scientia News
Exploring Ibuprofen
Last updated: 24/09/24

Its toxicodynamics, and balancing benefits and risks

What is ibuprofen?
Ibuprofen is a standard over-the-counter medicine which can be bought from supermarkets and pharmacies. It is primarily used for pain relief, such as back pain, period pain and toothache, and can also be used for arthritis pain and inflammation. It is available in various forms, including tablets, capsules, gels and sprays for the skin.

The toxicodynamics of ibuprofen
Toxicodynamics refers to the biological effects of a substance after exposure to it: the mechanisms by which the substance produces toxic effects and the target organs or tissues it affects. Ibuprofen works by inhibiting the enzymes that synthesise prostaglandins, a group of lipid molecules that cause inflammation, including symptoms like redness, heat, swelling and pain. Therefore, after ibuprofen acts, inflammatory responses and pain are reduced. Ibuprofen's target organs and tissues include the gastrointestinal tract, the kidneys, the central nervous system, the blood and more.

Balancing the benefits and risks
Ibuprofen's mechanism of action makes it a safe and effective pain relief medication for most people, and it is easily accessible and easy to use. However, it can affect its target organs and tissues negatively, and can therefore have serious side effects, especially if taken for an extended period and/or in high doses. These include heartburn, abdominal pain, kidney damage (especially in people who already have kidney problems), low blood count and more. It is therefore important to use ibuprofen responsibly, which starts with being well informed about its effects on the body, particularly its impact on organs and tissues. With caution and proper use, the side effects can be minimised.
One of the easiest ways to lessen side effects is to take the medication with food. Additionally, patients should take the lowest effective dose for the shortest possible time. If patients have a history of stomach problems, avoiding ibuprofen and using alternatives is the best solution. Patients can also talk to their GP if they are concerned about side effects, and report any suspected side effects using the Yellow Card safety scheme on the NHS website.

Links to find out more:
https://www.nhs.uk/medicines/ibuprofen-for-adults/about-ibuprofen-for-adults/
https://www.sciencedirect.com/topics/pharmacology-toxicology-and-pharmaceutical-science/toxicodynamics
https://www.chm.bris.ac.uk/motm/ibuprofen/ibuprofenh.htm
https://www.ncbi.nlm.nih.gov/books/NBK526078/

Written by Naoshin Haque
- Understanding diverticular disease | Scientia News
Understanding diverticular disease
Last updated: 09/01/25

The prevalence of diverticulosis is increasing in developed countries

Diverticulosis, diverticula, diverticulum and diverticulitis: they may sound similar, but each term describes a specific aspect of diverticular disease. Before diving in, let's clarify these key terms:
● Diverticulum: a small, bulging pouch that forms at a weak spot in the lining of the large intestine.
● Diverticula: the plural of diverticulum, indicating multiple bulging pouches in the large intestine's lining.
● Diverticulosis: a condition where multiple diverticula are present in the large intestine.
● Diverticulitis: when one or more diverticula become inflamed or infected.

What is diverticular disease?
Diverticular disease can be broadly categorised into two main conditions: diverticulosis and diverticulitis. Both involve the presence of diverticula in the colon, but the key difference lies in inflammation. In diverticulitis, the diverticula become inflamed or infected, leading to symptoms; diverticulosis, on the other hand, is typically asymptomatic. There is also a third condition, symptomatic uncomplicated diverticular disease (SUDD), in which diverticula are present without inflammation but the patient still experiences symptoms.

The prevalence of diverticulosis is increasing in developed countries, largely due to the typical 'Western diet', which is high in red meat and low in fibre. Lifestyle factors such as obesity, smoking and physical inactivity also contribute to this rise. Age is a significant factor too, with 85% of diverticulosis cases occurring in individuals over the age of 50.

Pathophysiology
The formation of diverticula in the colon is primarily due to three factors: structural abnormalities in the colonic wall, disordered intestinal motility, and a deficiency of dietary fibre.
The large intestine has two layers of muscle that work together to move its contents: an inner circular layer and an outer longitudinal layer. The outer layer consists of three bands called the taeniae coli, which run longitudinally along the colon. The gaps between these muscle bands are areas of weakness, making them vulnerable to the development of diverticula, and age-related weakening of the connective tissue further increases this risk. In some patients, abnormal gut motility can lead to areas of high pressure in the bowel, causing the mucosa to bulge outward and form diverticula. Similarly, a lack of fibre in the diet can increase bowel pressure and lead to irregular movement, which also promotes outpouching. As discussed, some patients with diverticula may remain asymptomatic, while others experience varying levels of discomfort. The transition from diverticulosis to diverticulitis occurs when undigested food or a faecalith becomes trapped in these pouches, causing a blockage. This allows bacteria to grow and multiply, resulting in infection and inflammation of the pouch.

Symptoms
Diverticular disease comes with a range of symptoms, some of which are quite common and could easily be mistaken for other conditions. General symptoms like nausea, vomiting, diarrhoea and fever often overlap with other digestive problems, making diagnosis tricky. However, certain symptoms can hint more strongly at diverticular disease: pain in the lower left side of the abdomen (the left iliac fossa) or rectal bleeding are more specific indicators that may point towards this condition. Recognising these symptoms can help in getting an accurate diagnosis and appropriate treatment.

Management
Managing diverticular disease depends on the individual patient and the severity of their symptoms.
For some, simple, conservative treatments are enough: staying hydrated, eating a high-fibre diet, and giving the bowel a short rest by temporarily avoiding food. However, if a patient is experiencing significant pain or signs of infection, medical treatment is necessary. This may involve pain relief based on the WHO pain ladder, or antibiotics to tackle the infection. In more serious cases, where other treatments haven't worked or the patient is in a life-threatening situation, surgery might be required.

A common procedure for these severe cases is the Hartmann's procedure. This surgery removes the damaged section of the large intestine, usually due to infection or blockage. The healthy end of the intestine is brought out through an opening in the abdomen, creating a temporary colostomy that allows waste to leave the body through a bag. This setup gives the intestine time to heal, and in some cases a follow-up surgery can reconnect it for normal function.

Complications
There are both short-term and long-term complications associated with diverticulitis, particularly in more severe cases that require more aggressive treatment such as surgery (see Figure 4).

Future directions
Recent changes in the management of diverticulitis have shifted how clinicians approach treatment. One significant update involves the use of antibiotics. Traditionally, diverticulitis was treated with routine antibiotic prescriptions; however, newer guidelines suggest that antibiotics may not be necessary for uncomplicated cases, helping to reduce both antibiotic resistance and potential medication side effects for patients. Another emerging trend is treating uncomplicated diverticulitis on an outpatient basis, which allows patients to be managed at home with pain relief and dietary adjustments and, in turn, frees up hospital resources for those with more severe conditions. Additionally, the management of complicated diverticulitis has evolved.
For instance, abscesses may now be treated with percutaneous drainage rather than resorting to emergency surgery.

Conclusion
In summary, diverticular disease can vary widely in its symptoms and required treatments, ranging from dietary changes to surgical interventions for severe cases. Identifying specific signs and understanding the treatment options can empower patients and help them make informed choices. Advances in treatment approaches are also helping to improve outcomes and quality of life for those affected.

Written by Abbasali Gulamhussein
Related articles: Crohn's disease / The gut microbiome

REFERENCES
Cater, M. (2023). Foods for Diverticulosis and Diverticulitis. [online] www.hopkinsmedicine.org. Available at: https://www.hopkinsmedicine.org/health/wellness-and-prevention/foods-for-diverticulosis-and-diverticulitis.
Matrana, M.R. and Margolin, D.A. (2009). Epidemiology and pathophysiology of diverticular disease. Clinics in Colon and Rectal Surgery. Available at: https://pmc.ncbi.nlm.nih.gov/articles/PMC2780269/ (Accessed: 12 October 2024).
Miller, A.S. et al. (2021). The Association of Coloproctology of Great Britain and Ireland consensus guidelines in emergency colorectal surgery. Colorectal Disease. Available at: https://pmc.ncbi.nlm.nih.gov/articles/PMC9291558/ (Accessed: 12 October 2024).
NHS (2019). Diverticular disease and diverticulitis. [online] NHS. Available at: https://www.nhs.uk/conditions/diverticular-disease-and-diverticulitis/.
Sciencedirect.com (2019). Hartmann Procedure - an overview | ScienceDirect Topics. [online] Available at: https://www.sciencedirect.com/topics/medicine-and-dentistry/hartmann-procedure.
Singh, B., May, K., Coltart, I., Moore, N. and Cunningham, C. (2008). The Long-Term Results of Percutaneous Drainage of Diverticular Abscess. The Annals of The Royal College of Surgeons of England, [online] 90(4), pp.297–301.
doi: https://doi.org/10.1308/003588408x285928.
Ubhi, L. (2023). Prescribing Analgesia and the WHO Analgesic Ladder | Geeky Medics. [online] geekymedics.com. Available at: https://geekymedics.com/prescribing-analgesia-and-the-who-analgesic-ladder/.
- The Dual Role of Mitochondria | Scientia News
The Dual Role of Mitochondria
Last updated: 06/01/25

Powering life and causing death

Mitochondria as mechanisms of apoptosis
Mitochondria are famous for being the 'powerhouse of the cell', producing ATP for respiration as the site of the Krebs cycle and the electron transport chain with its electron carriers. Less well known is their role in mediating programmed cell death, or apoptosis. This is a tightly controlled process that removes damaged cells and helps prevent the growth of cancer cells. One way apoptosis occurs is through the mitochondria initiating protein activation in the cytosol (the fluid part of the cytoplasm): proteins such as cytochrome c activate caspases by binding to them, causing cell death. Caspases are enzymes that degrade cellular components so they can be removed by phagocytes. Mitochondrial apoptosis is also controlled by the B cell lymphoma 2 (BCL-2) family of proteins, which is split into pro-apoptotic and pro-survival members; the correct balance between these two types of BCL-2 protein is important in cellular life and death.

Regulation and initiation of mitochondrial apoptosis
Mitochondrial apoptosis is regulated by the BCL-2 family of proteins. These can be activated by processes such as transcriptional upregulation (an increase in the production of RNA from a gene) or post-translational modification (the addition of chemical groups, such as acetyl or methyl groups, to proteins after they have been translated from RNA, which can change their structure and interactions). Following one of these processes, BAX and BAK, two pro-apoptotic BCL-2 proteins, are activated. They form pores in the mitochondrial outer membrane in a process called mitochondrial outer membrane permeabilisation (MOMP).
This allows pro-apoptotic proteins to be released into the cytosol, leading to apoptosis.

Therapeutic uses of mitochondria
Dysregulation of mitochondrial apoptosis can lead to many neurological and infectious diseases, including neurodegenerative diseases and autoimmune disorders, as well as cancer. Mitochondria can therefore act as important drug targets, providing therapeutic opportunities. Some peptides and proteins, known as mitochondriotoxins or mitocans, are able to trigger apoptosis, and their use has been investigated for cancer treatment. One example of a mitochondriotoxin is melittin, the main component of bee venom. This compound incorporates into plasma membranes and interferes with the organisation of the bilayer by forming pores, which stops membrane proteins from functioning. Drugs containing melittin have been used as treatments for conditions such as rheumatoid arthritis and multiple sclerosis. Melittin has also been investigated as a potential cancer treatment, and it induced apoptosis in certain types of leukaemia cells, resulting in the downregulation of BCL-2 proteins (decreased expression and activity). This melittin-induced apoptosis is a preclinical finding, and more research is needed before clinical application; it shows, however, that the mechanisms of mitochondrial apoptosis can be harnessed to create novel therapeutics for diseases such as cancer.

It is evident that mitochondria are essential for respiration but also involved in apoptosis, regulated by the activation of proteins such as BCL-2, BAX and BAK. With further research, scientists can develop more targeted and effective drugs to treat the various diseases associated with mitochondria.

Written by Naoshin Haque
- The world vs the next pandemic | Scientia News
The world vs the next pandemic
Last updated: 18/11/24

The human race has witnessed around ten pandemics over the course of the past 300 years. COVID-19, the most recent, killed approximately 6.9 million people and infected nearly 757 million. Though seemingly large, the number of deaths caused by the coronavirus is still comparatively fewer than in the worst pandemics of the past, which killed around 50–100 million people globally. These large numbers may seem like statistics from a century ago, but many scientists predict destruction on the same scale from future pandemics, heightening concern about how prepared we are for when the next big outbreak strikes.

It is impossible to know when the next pandemic will hit or how many casualties it will bring. The only certainty is that it cannot be avoided, which raises the question of how to mitigate its impact and reduce large-scale losses. During the COVID-19 outbreak, we observed that preventive measures such as social distancing and face coverings could interrupt viral transmission to some degree. Additionally, strategies like lockdown, isolation and timely treatment can help contain the virus and aid the recovery of those already infected. These measures, however, can only be taken if the threat is detected promptly, before it infects a larger population.

To prevent an infection from becoming an outbreak, strategies that focus on the source of the disease can prove highly advantageous. Preventive measures may include:
● monitoring the movement of wildlife that potentially carries harmful pathogens
● studying the interactions between different species in the wild
● surveillance of domestic and international wildlife trade, and strict enforcement of biosecurity laws.
Additionally, an effort needs to be made to share the generated data with laboratories worldwide to promote scientific collaboration.
Once a threat is identified, quick decision-making with the correct precautions needs to take place. Simultaneously, investment in research into mRNA vaccine development, novel drug treatments and emerging technologies needs to be increased. In conclusion, strategies for managing the next pandemic need to operate through multi-level governance, with optimal coordination between the different institutions involved in crisis management.

There is a constant threat of pandemics looming over the world. An outbreak is inevitable, but its effect depends solely on the preparedness and response of governmental bodies and global health institutions. Is it going to be a hurricane of destruction, or will it just pass by like a gust of wind? Only time will tell.

Written by Navnidhi Sharma
Related articles: Diabetes mellitus as an epidemic / Are pandemics becoming less severe?

REFERENCES
Coccia, M. (2021). Pandemic Prevention: Lessons from COVID-19. Encyclopedia, 1(2), 433–444. https://doi.org/10.3390/encyclopedia1020036
Cockerham, W. C., & Cockerham, G. B. (2021). The COVID-19 reader: the science and what it says about the social. Routledge.
Frieden, T. R., Buissonnière, M., & McClelland, A. (2021). The world must prepare now for the next pandemic. BMJ Global Health, 6(3), e005184. https://doi.org/10.1136/bmjgh-2021-005184
Garrett, L. (2005). The Next Pandemic. Foreign Affairs, 84, 3. https://heinonline.org/HOL/LandingPage?handle=hein.journals/fora84&div=61&id=&page=
World Health Organisation. (2022). WHO Coronavirus (COVID-19) Dashboard. covid19.who.int. https://covid19.who.int/?mapFilter=deaths
World Economic Forum. (2021, November 30). COVID-19: How much will it cost to prepare the world for the next pandemic? https://www.weforum.org/agenda/2021/11/preparing-for-next-pandemic-covid-19
- A potential treatment for HIV | Scientia News
A potential treatment for HIV
Last updated: 28/11/24

Can CRISPR/Cas9 overcome the challenges posed by current HIV treatments?

The human immunodeficiency virus (HIV) was recorded to affect 38.4 million people globally at the end of 2021. The virus attacks the immune system, incapacitating CD4 cells: white blood cells (WBCs) that play a vital role in coordinating the adaptive immune response and fighting infection. The normal range of CD4 cells in the body is 500 to 1500 cells/mm3 of blood; HIV can rapidly deplete the CD4 count to dangerous levels, damaging the immune system and leaving the body highly susceptible to infection. Whilst antiretroviral therapy (ART) can help manage the virus by interfering with viral replication and helping the body control the viral load, it fails to eliminate the virus altogether. The reason is the presence of latent viral reservoirs, where HIV can lie dormant and reignite infection if ART is stopped.

Whilst a cure has not yet been discovered, a promising avenue being explored in the hope of eradicating HIV is CRISPR/Cas9 technology. This highly precise gene-editing tool has been shown to induce mutations at specific points in the HIV proviral DNA. Guide RNAs pinpoint the desired genome location, and the Cas9 nuclease acts as molecular scissors that remove selected segments of DNA. CRISPR/Cas9 technology therefore provides access to the viral genetic material integrated into the genome of infected cells, allowing researchers to cleave HIV genes from infected cells and clear latent viral reservoirs. Furthermore, the CRISPR/Cas9 gene-editing tool can also prevent HIV from attacking CD4 cells in the first place: HIV binds to the chemokine receptor CCR5, expressed on CD4 cells, in order to enter the WBC.
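The guide-RNA targeting step described above can be illustrated with a toy search. This is a minimal sketch, assuming the commonly used SpCas9 enzyme, which cuts only where the 20-nucleotide guide-matching sequence (the protospacer) sits immediately next to an 'NGG' PAM motif; the sequences and function name below are hypothetical illustrations, not part of any real editing pipeline:

```python
import re

def find_cas9_targets(dna: str, guide: str) -> list[int]:
    """Return 0-based positions where the guide-matching protospacer
    occurs in `dna` immediately followed by an NGG PAM (SpCas9 rule)."""
    hits = []
    for i in range(len(dna) - len(guide) - 2):
        if dna[i:i + len(guide)] == guide:
            pam = dna[i + len(guide):i + len(guide) + 3]
            if re.fullmatch("[ACGT]GG", pam):  # 'N' = any base, then GG
                hits.append(i)
    return hits

# Toy example: a 20-nt guide match followed by a TGG PAM is a valid
# cut site; the same match without a PAM would be ignored by Cas9.
guide = "GACGATGACGATAAGGATCC"
dna = "TTT" + guide + "TGG" + "AAAA"
print(find_cas9_targets(dna, guide))  # [3]
```

Real genome-editing tools also scan the reverse-complement strand and score near-matches to flag off-target sites; this sketch shows only the forward-strand, exact-match case.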
CRISPR/Cas9 can cleave the genes encoding the CCR5 receptor, thereby preventing the virus from entering and replicating inside CD4 cells. CRISPR/Cas9 technology thus addresses a problem that current antiretroviral therapies cannot: through gene editing, researchers can dispel the lasting reservoirs, unreachable by ART, that HIV establishes in our bodies. However, further research and clinical trials are still required to fully understand the safety and efficacy of this approach before it can be implemented as a standard treatment.

Written by Bisma Butt
Related article: Antiretroviral therapy
- Are aliens on Earth? | Scientia News
Are aliens on Earth?
Last updated: 24/09/24

Applications of ancient DNA analysis

During a recent congressional hearing on UFOs held in Mexico, two alleged alien corpses were presented by UFO enthusiast Jaime Maussan. These artefacts were met with scepticism due to Maussan's five previous claims to have found aliens, all debunked as mummified human remains. To verify the newly found remains as alien, various lab tests have been performed, one being a carbon-14 analysis by researchers at the National Autonomous University of Mexico. This analysis estimated the corpses to be approximately 1000 years old. Determining the corpses' genetic make-up is another essential step in verifying the supposed alien remains, but is it possible for such ancient remains to undergo DNA analysis?

Yes; in fact, there are methods specialised for cases such as these that enable ancient DNA (aDNA) analysis. The relatively recent advent of high-throughput sequencing technology has made DNA sequencing a more rapid and inexpensive process. However, aDNA has fundamental qualities that complicate its analysis, such as postmortem damage, extraneous co-extracted DNA and the presence of other contaminants. Extra steps are therefore essential in the bioinformatics workflow to make sure the aDNA is sequenced and analysed as accurately as possible. So, let's talk about the importance of aDNA analysis in various areas, and how looking at the genetics of the past, and potentially of space, can unearth information for modern research.

Applications of aDNA sequencing and analysis
Analysis of ancient DNA is a useful technique for discovering human migration events from hundreds of centuries ago.
For example, analyses of mitochondrial DNA (mtDNA) have repeatedly substantiated the "Recent African Origin" theory of modern human origins; the most recent common ancestor of human mtDNA was found to have existed in Africa about 100,000-200,000 years ago. There have also been other recent studies within phylogeography; an aDNA study on skeletal remains of ancient northwestern Europeans, carried out in 2022, showed that mediaeval society in England was likely the result of mass migration across the North Sea from the Netherlands, Germany and Denmark. Thus, these phylogeographic discoveries improve our knowledge of the historic evolution and migration of human populations. Paleopathology, the study of disease in antiquity, is another area for which ancient DNA analysis is important. Analysis of DNA from victims of the Plague of Justinian and the Black Death identified Yersinia pestis as the causal agent of these pandemics. aDNA analysis is consequently important for revealing how diseases affected past populations, and the derived genetic information can be used to identify their prevalence in modern society. Exciting, yet debatably ethical, plans for the de-extinction of species have also been announced. The biotech company Colossal announced plans in 2021 to resurrect the woolly mammoth, among other species such as the Tasmanian tiger and the dodo; other groups plan to resurrect the Christmas Island rat and Steller's sea cow. In theory this is exciting, or scary from certain ecological perspectives, but it is complicated in practice: even though nuclear genomes have been sequenced from more than 20 extinct species, no species has been restored to date. Are aliens on Earth? Thus, ancient DNA analysis can be applied to a multitude of areas to give historical information that we can carry into the modern world. But, finally, are these 'alien' corpses legitimately from outer space?
José Zalce Benitez, director of the Health Sciences Research Institute in the office of the Secretary of the Mexican Navy, reported on the scientists' findings. The DNA tests were allegedly compared against over one million species and found not to be genetically related to "what is known or described up to this moment by science." In essence, genetic testing has not contradicted Maussan's claim that these remains are alien, so the possibility of their alien identity cannot yet be dismissed. However, this genetic testing does not appear to be peer-reviewed; NASA is reportedly interested in the DNA analysis of these corpses, so we await further findings. Ancient DNA analysis will undoubtedly provide intriguing information about life from outer space or, alternatively, about how this DNA code was faked. Whatever the outcome, ancient DNA analysis remains an exciting area of research into the life that preceded us. Written by Isobel Cunningham
- Bone cancer | Scientia News
Bone cancer Last updated: 03/06/24, 15:01 Pathology and emerging therapeutics Introduction: what is bone cancer? Primary bone cancer can originate in any bone. However, most cases develop in the long bones of the legs or upper arms. Each year, approximately 550 new cases are diagnosed in the United Kingdom. Primary bone cancer is distinct from secondary bone cancer, which occurs when cancer spreads to the bones from another region of the body. The focus of this article is on primary bone cancer. There are several types of bone cancer: osteosarcoma, Ewing sarcoma, and chondrosarcoma. Osteosarcoma originates in the osteoblasts that form bone. It is most common in children and teens, with the majority of cases occurring between the ages of 10 and 30. Ewing (pronounced YOO-ing) sarcoma develops in bones or the soft tissues around them. Like osteosarcoma, this cancer type is more common in children and teenagers. Chondrosarcoma occurs in the chondrocytes that form cartilage. It is most common in adults between the ages of 30 and 70 and is rare in the under-21 age group. Causes of bone cancer include genetic factors, such as inherited mutations and syndromes, and environmental factors, such as previous radiation exposure. Treatment will often depend on the type of bone cancer, as the specific pathogenesis of each case is unknown. What is the standard treatment for bone cancer? Most patients are treated with a combination of surgical excision, chemotherapy, and radiation therapy. Surgical excision is employed to remove the cancerous bone. Typically, it is possible to repair or replace the bone, although amputation is sometimes required. Chemotherapy involves using powerful chemicals to kill rapidly growing cells in the body. It is widely used for osteosarcoma and Ewing sarcoma but less commonly for chondrosarcoma.
Radiation therapy (also termed radiotherapy) uses high doses of radiation to damage the DNA of cancer cells, killing them or slowing their growth. Six out of every ten individuals with bone cancer will survive for at least five years after their diagnosis, and many of these will be completely cured. However, these treatments have limitations in terms of effectiveness and side effects. The limitation of surgical excision is the inability to eradicate microscopic cancer cells around the edges of the tumour; additionally, the patient must be able to withstand the surgery and anaesthesia. Chemotherapy can harm the bone marrow, which produces new blood cells, leading to low blood cell counts and an increased risk of infection due to a shortage of white blood cells. Radiation therapy, meanwhile, uses high doses of radiation that can damage nearby healthy tissues such as nerves and blood vessels. Taken together, this underscores the need for a therapeutic approach that is non-invasive, bone cancer-specific, and has limited side effects. miR-140 and tRF-GlyTCC Dr Darrell Green and colleagues investigated the role of small RNAs (sRNAs) in bone cancer and its progression. Through the analysis of patient chondrosarcoma samples, the researchers identified two sRNA candidates associated with overall patient survival: miR-140 and tRF-GlyTCC. miR-140 was suggested to inhibit RUNX2, a gene upregulated in high-grade tumours. Simultaneously, tRF-GlyTCC was demonstrated to inhibit RUNX2 expression by displacing YBX1, a multifunctional protein with various roles in cellular processes. Interestingly, the researchers found that tRF-GlyTCC was attenuated during chondrosarcoma progression, indicating its potential involvement in disease advancement. Furthermore, since RUNX2 has been shown to drive bone cancer progression, miR-140 and tRF-GlyTCC present themselves as promising therapeutic targets.
CADD522 Dr Darrell Green and colleagues subsequently investigated the impact of a novel therapeutic agent, CADD522, designed to target RUNX2. In vitro experiments revealed that CADD522 reduced proliferation in chondrosarcoma and osteosarcoma. However, a bimodal effect was observed in Ewing sarcoma: lower doses of CADD522 promoted sarcoma proliferation, whereas higher doses of the same drug suppressed it. In mouse models treated with CADD522, there was a significant reduction in tumour volume in both osteosarcoma and Ewing sarcoma. Take-home message The results described here contribute to understanding the molecular mechanisms involved in bone cancer. They highlight the anti-proliferative and anti-tumoral effects of CADD522 in treating osteosarcoma and Ewing sarcoma. Further research is necessary to fully elucidate the specific molecular mechanism of CADD522 in bone cancer and to identify potential side effects. By Favour Felix-Ilemhenbhio Related article: Secondary bone cancer
- Motivating the Mind | Scientia News
Motivating the Mind Last updated: 18/07/24, 10:35 MIT scientists found reward sensitivity varies by socioeconomic status Behaviour is believed by many, including the famous psychologist B.F. Skinner, to be reinforced by rewards; the degree to which an individual is motivated by rewards is called reward sensitivity. Another common view is that behaviour is influenced by the environment, which nowadays includes socioeconomic status (SES). People with low SES encounter fewer rewards in their environment, and this scarcity could affect their reward-seeking behaviour (Farah, 2017). Thus, a study by Decker et al. (2024) investigated the effect of low SES on reward sensitivity in adolescents through a gambling task, using fMRI to measure response times, choices and activity in the striatum, the reward centre of the brain. The researchers hypothesised that response times to immediate rewards, average reward rates and striatal activity would differ between participants from high and low SES backgrounds. See Figure 1. The study involved 114 adolescents whose SES was measured using parental education and income. The participants took part in a gambling task that involved guessing whether numbers were higher or lower than 5; the outcomes were pre-determined to create blocks of reward abundance and reward scarcity. Teenagers from both low and high SES backgrounds responded faster and switched guesses when rewards were given more often. Also, immediate rewards made the participants repeat prior choices and slowed their response times. In line with the hypothesis, adolescents with lower SES slowed down less after rare rewards. Moreover, lower SES was linked with smaller differences between reward and loss activation in the striatum, indicating experience-based plasticity in the brain. See Figure 2. Therefore, the research by Decker et al. (2024) has numerous implications for the real world.
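To make the block design concrete, here is a minimal simulation of how pre-determined outcomes can create reward-abundant and reward-scarce blocks. This is an illustrative sketch, not the study's actual task code; the function name, trial counts and reward probabilities are all hypothetical choices.

```python
import random

def simulate_block(n_trials: int, reward_prob: float, rng: random.Random) -> list:
    """One block of the higher/lower guessing task.

    Outcomes are fixed in advance according to reward_prob, independent
    of what the participant guesses, mirroring a pre-determined design.
    """
    return [rng.random() < reward_prob for _ in range(n_trials)]

rng = random.Random(42)
abundant = simulate_block(30, 0.8, rng)  # reward-abundant block
scarce = simulate_block(30, 0.2, rng)    # reward-scarce block
print(f"rewards in abundant block: {sum(abundant)}/30")
print(f"rewards in scarce block:   {sum(scarce)}/30")
```

Because the outcome sequence is fixed before the participant guesses, any behavioural difference between blocks (response speed, guess switching) reflects sensitivity to the reward environment rather than skill at the task.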
As adolescents with lower SES displayed reduced behavioural and neural responses to rewards and, according to behaviourism, rewards are essential to learning, attention and motivation, it can be inferred that SES plays a role in inequalities across many cognitive abilities. This critically impacts our understanding of socioeconomic differences in academic achievement, decision-making and emotional well-being, especially if we consider that differences in SES contribute to prejudice based on ingroups and outgroups. Interventions to enhance motivation and engagement with rewarding activities could help buffer against the detrimental impacts of low-SES environments on cognitive and mental health outcomes. Overall, this research highlights the need to address systemic inequities that limit exposure to enriching experiences and opportunities during formative developmental periods. Written by Aleksandra Lib Related article: A perspective on well-being REFERENCES Decker, A. L., Meisler, S. L., Hubbard, N. A., Bauer, C. C., Leonard, J., Grotzinger, H., Giebler, M. A., Torres, Y. C., Imhof, A., Romeo, R., & Gabrieli, J. D. (2024). Striatal and behavioral responses to reward vary by socioeconomic status in adolescents. The Journal of Neuroscience, 44(11). Farah, M. J. (2017). The neuroscience of socioeconomic status: Correlates, causes, and consequences. Neuron, 96(1), 56-71.