By Alice Sardarian
In 1999, James Harrison was awarded the Medal of the Order of Australia for saving millions of children’s lives through his blood plasma donations. Harrison, also known as the “man with the golden arm,” has donated plasma almost every week for the past 60 years, spanning the entire range of donor ages permitted by the Australian Red Cross Blood Service. Plasma is the liquid component of blood that carries nutrients, ions, proteins, and more; it does not include platelets, leukocytes, or erythrocytes. Plasma donation involves continuously withdrawing blood, separating the red blood cells from the plasma, and then returning the red blood cells to the donor. According to the Red Cross, plasma may be donated much more frequently than whole blood: donors must wait only 7 days between plasma donations, as opposed to between 8 and 16 weeks for whole blood.
By Elifsu Gencer
The joy I feel when I spot a perfect little snowflake on my coat sleeve is truly unmatched. Having been taught that every snowflake is unique, I marvel at the level of detail I can see with my bare eyes on something so small. I can’t help but think, though, that it does kind of look like every other snowflake that I’ve examined. How do we know that each snowflake is actually unique? And what does “unique” really mean?
By Elaine Zhu
Illustration by Lizka Vaintrob
For the average college student, each week seems to be a battle to balance a boatload of classes, homework, a social life, and other extracurricular activities. With the number of activities students need to manage, it is no surprise that, according to an assessment completed by the American College Health Association, 63% of college students felt overwhelming anxiety in 2018. This trend does not end after college either—there are some 40 million adults in the United States who have an anxiety disorder. On top of these responsibilities, getting the full, recommended eight hours of sleep a night is also essential to a student’s schedule. However, the Centers for Disease Control and Prevention estimates that at least 35% of adults, including college students, are not getting enough sleep at night. Research has shown that a lack of sleep and anxiety disorders often go hand in hand. Sleep disruption is linked with the development of anxiety disorders and is also present in post-traumatic stress disorder, panic disorder, and social anxiety disorder.
Learning From Mistakes: How the Decoupling Between Confidence and Action in Patients With OCD Affects the Decision-Making Process
By Victoria Comunale
If one is not confident in their knowledge before a test, they usually study. This trend—that one’s confidence and one’s actions are often linked—is true for countless other scenarios concerning decision-making and learning. Someone with obsessive-compulsive disorder (OCD), however, may find themselves in striking defiance of the trend, often to their detriment. Such scenarios can range from washing their hands countless times even though they know it isn’t improving their hygiene to checking in on loved ones excessively to ensure their safety despite countless reassurances. The compulsive actions of people affected by OCD are disproportionate, especially relative to their apparent confidence level, leading scientists to postulate that such individuals’ actions and confidence may not be linked during learning.
In 2017, researchers at the University of Cambridge published a study that revealed a dissociation between the actions and confidence of patients with OCD. In the study, researchers designed an experiment to test how one’s confidence affects one’s actions in a decision-making process by assessing participants’ performance in an online game. The game involved asking participants to use a bucket to catch particles that came from a central source. The bucket was placed along a ring around the source, and the particles would be found in a general area for a period of time before moving to another area on the ring. The position of a particle generally followed a pattern, but once in a while a particle would appear in a location that deviated from the general pattern.
The method of assessing the correlation of confidence and action relies upon the difference in responses to the prediction error associated with these randomly placed particles. Prediction error refers to a mistake made in regard to the occurrence of an event based on previous data. Here, the prediction error is the gap between where the bucket was placed, based on the general pattern, and where a randomly placed particle actually landed.
Returning to the correlation of confidence and actions, when one is uncertain about a situation, new information can be very influential. In contrast, if someone is very confident, they are less influenced by new information. Researchers found that patients with OCD were more likely to place the bucket where the last particle landed in comparison to the control group—in other words, they overreacted to their prediction error. The groups did not differ significantly in how many particles were caught, but OCD patients’ prediction error prompted them to overcompensate in the subsequent placement of the bucket. Such patients moved the bucket in a way that overemphasized the prediction error, rather than following the general trend. Participants in the control group did not pay significant attention to such changes unless a substantial shift took place in the general trend of where the particles continued to land.
An interesting outcome was that confidence ratings were very similar between patients with OCD and the control group. This suggests that both groups developed a sense of what was going on, yet participants with OCD did not use that knowledge to inform their bucket placement. The computed learning rates of the trial indicated that in extreme cases, some OCD patients disregarded the accumulated information and updated the position of the bucket in accordance with the most recent outcome.
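The bucket-placement behavior above can be captured by a simple delta-rule update, in which a learning rate sets how strongly each prediction error moves the next placement. The sketch below is illustrative only; the function name and numbers are assumptions, not values from the study.

```python
# Minimal sketch of a delta-rule update (illustrative; the function name
# and numbers are assumptions, not taken from the study). The learning
# rate controls how strongly a single prediction error moves behavior.

def update_bucket(position, particle, learning_rate):
    """Shift the bucket toward the last particle by a fraction of the error."""
    prediction_error = particle - position
    return position + learning_rate * prediction_error

learned_position = 50.0   # where the general pattern says particles land
outlier_particle = 90.0   # a randomly placed particle

# A cautious learner makes a small correction; an extreme learner (rate
# near 1) discards accumulated knowledge and jumps to the last outcome.
print(update_bucket(learned_position, outlier_particle, 0.1))  # 54.0
print(update_bucket(learned_position, outlier_particle, 1.0))  # 90.0
```

In this framing, the control group behaves like the low-rate learner, while the extreme cases described above correspond to a learning rate near one.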
Ultimately, it was found that the degree of decoupling between confidence and action is correlated with the severity of OCD in patients, which suggests that decoupling is an underlying factor behind the irrational behavior that characterizes OCD. In the future, studies like this may help improve our understanding of human learning for those with OCD and thus better support individuals with the disorder.
By Elaine Zhu
Australia has been in the global spotlight as bushfires have ravaged millions of acres of land since late 2019, and the situation only seems to be getting worse. Long periods of severe drought and temperatures up to 105.6 degrees Fahrenheit have exacerbated the bushfires. Not only have more than eleven million acres of forest and parks been damaged, but the fires have also killed at least thirty-three people and almost five hundred million animals, causing extensive damage to Australian ecosystems.
Wildfires are not an uncommon occurrence in Australia. They usually occur every autumn, burning millions of acres every year. In fact, some species of Australian native plants rely on these fires to regenerate, and Australians often use these types of fires for land management and agriculture. Bushfires need fuel (forests, greenery, and shrubs) and oxygen to ignite, but the speed and intensity of a fire are influenced by other factors, including temperature, soil condition, moisture levels, and wind speed. However, the recent Australian bushfires are on a completely different level compared to previous isolated wildfire events: they have burned almost eight times more land than the 2018 fires in California—the worst in California’s history.
So what exactly is causing these increasingly dangerous wildfires? To many, the Australian fires are a clear indication of the relationship between climate change and worsening extreme weather patterns. Research from the 2017 Climate Science Special Report has shown that decreases in soil moisture content from higher temperatures have been traced back to human actions and influences and will only make the risk of fires worse. Stefan Rahmstorf, a lead author of the United Nations’ Intergovernmental Panel on Climate Change’s Fourth Assessment Report, states that “due to enhanced evaporation in warmer temperatures, the vegetation and the soils dry out more quickly, so even if the rainfall didn’t change, just the warming in itself would already cause a drying of vegetation and therefore increased fire risk.” As the climate becomes warmer and creates drier conditions, it is much easier for fires to start; these dry conditions only help the fire spread to other areas. This drying trend also affects the amount of rainfall that Australia receives. The Climate Council released a press briefing explaining that the regions of Tenterfield and Stanthorpe in Australia had 77 percent less rainfall than the usual annual average. Overall, the southeastern regions of Australia have had a 15 percent decline in rainfall during the autumn months and a 25 percent decline in rainfall in the months of April and May. Rainfall is especially important during the bushfire seasons, as lower levels of rainfall will lengthen the duration of the bushfires.
These bushfires haven’t just caused detrimental effects to the people and animals of Australia, but have also had lasting effects that will contribute to the global climate change problem—one of the factors that exacerbated the Australian bushfires in the first place—and thus create a positive feedback loop. Mark Parrington, senior scientist for the European Centre for Medium-Range Weather Forecasts, states that from September 2019 to January 2020, “the wildfires released around 400 million tons of CO2, which is roughly the same amount the United Kingdom emits in an entire year.” The accumulation of carbon dioxide in the atmosphere will only exacerbate climate change and its global effects, including the melting rate of glaciers. The soot from the fires can latch onto the glaciers, creating an almost brown caramel color on the surface. This change in color decreases the reflectivity of the ice, which makes the glaciers melt even faster, increasing the ocean temperature and decreasing the amount of sunlight reflected back into the atmosphere.
With the detrimental effects that the bushfires have had on Australia, it wouldn’t be too far of a leap to assume that Australian leadership might take action to reduce climate change and prevent more bushfires from happening. However, the Australian Prime Minister, Scott Morrison, has had a history of opposing action to reduce carbon emissions and has downplayed the effects of climate change on the intensity of the Australian bushfires. In the past, he has spoken out against taxing carbon emissions and has promoted and protected coal mining. The Australian bushfires are just one of the consequences of anthropogenic effects on climate change and the environment. In order to decrease the severity of bushfires and the rise of global climate change, we must recognize the harmful effects that climate change has caused and will continue causing to future generations. We must take action to reduce carbon emissions worldwide.
By Clare Nimura
A hundred years ago, you would not have much hope of survival if one of your organs suddenly stopped working. Today, however, there exists a vast system of organ donation. Surgeons perform complex medical procedures to remove healthy organs from willing donors and to replace missing or damaged ones in recipients. Problem solved! But not quite…
There are currently more than 113,000 people on the national waiting list for organ transplants, with a new person added to the list every 10 minutes. These patients have diseases like cardiomyopathy, diabetes, cystic fibrosis, or cirrhosis and are in need of working replacements for their kidneys, livers, hearts, lungs, or other vital organs. The number of people in need of transplants greatly outweighs the number of transplants performed each year (less than 40,000), and about 20 people die each day while waiting for a new organ. How did such a large discrepancy arise? The answer is twofold: under-registration of donors and a broken donation system.
On the surface, organ donation seems simple—just replace a dysfunctional organ with a healthy one—but, in reality, there are many more variables at play. Most people are unaware that only 3 out of 1000 people die in a way that leaves their organs viable for donation, or that a single donor can save up to eight lives if they are able to donate their heart, lungs, liver, pancreas, kidneys, and intestines. If you do the math, this does not yield nearly enough viable organs for the thousands on the waiting list. Additionally, though 95% of adults in the United States support organ donation, only 58% are actually registered. These dismal statistics are compounded by a highly corrupt and ineffective system for organ recovery, which wastes thousands of viable organs every year. How did this critical system end up in such a terrible condition?
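As a rough way of “doing the math,” the sketch below combines the figures above with one outside assumption: roughly 2.8 million deaths per year in the United States (an illustrative ballpark, not a number stated in this article). It is a back-of-envelope estimate, not a model of the actual donation pipeline.

```python
# Back-of-envelope version of "doing the math," using the figures cited
# above plus one assumed input: roughly 2.8 million deaths per year in
# the United States (illustrative, not stated in the article).

us_deaths_per_year = 2_800_000   # assumed ballpark
viable_fraction = 3 / 1000       # deaths that leave organs viable for donation
registered_fraction = 0.58       # adults actually registered as donors
max_lives_per_donor = 8          # heart, lungs, liver, pancreas, kidneys, intestines

viable_donors = us_deaths_per_year * viable_fraction * registered_fraction
potential_transplants = viable_donors * max_lives_per_donor

# Even this optimistic ceiling (~39,000 transplants) falls far short of
# the 113,000-person waiting list.
print(round(viable_donors), round(potential_transplants))
```

Even under the most generous assumption that every viable registered donor gives all eight organs, the total lands near the number of transplants actually performed each year, far below the waiting list.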
In a perfect world, an organ donation would be orchestrated as follows: a potential donor would be identified, proper consent would be obtained, and the organ would be harvested and stored in sterile packaging to be transported to the recipient’s transplant hospital, where the next patient on the waiting list would finally get their new organ. In reality, this is far from the sequence of events. Not only are there channels by which the very wealthy can get added to multiple waiting lists, there is corruption in the organ donation process itself, which results in tens of thousands of healthy organs left untouched.
Organ Procurement Organizations (OPOs) are, by federal law, the only group that can recover organs from deceased donors for transplantation. There are 58 such groups in the United States; they are non-profit contractors who are responsible for coordinating donations. The problem is that not only do these groups hold monopolies over their designated areas of service, they also follow an internal evaluation system that does not incentivize the pursuit of every viable organ. One recent study demonstrated that with reforms to increase efficiency and effectiveness of the organ donation system, there is the potential to recover up to 28,000 more organs per year and to save billions of dollars in the process. When OPOs fail to show up, people die.
How can we heal this broken system? First, you can sign up to be an organ donor at https://www.organdonor.gov/register.html. And second, you can encourage your congressional representatives to push for transparency in OPO metrics and for improved accountability. Removing perverse incentives and initiating external audits are two potential improvements. At this moment, there are viable lungs, hearts, livers, and other organs ready for transplant, and there is a sick patient somewhere who has been waiting for those organs, maybe for years, but may never get them because of the corruption in the organ donation process. This should not be the case, and there are simple solutions that would improve the situation greatly. Every small improvement is worth it; every organ recovered is another life potentially saved.
By Vivian Liu
The software revolution has taken the world by storm. Many things that we see or use in our everyday lives are automated: from data processing to virtual assistants, computer programs are helping us complete tasks that would otherwise be extremely resource-intensive.
Questions that would have taken significant human capital can be pipelined to a computer program or fed into a function for a quick output. Software is particularly helpful in allowing us to answer large numbers of objective questions en masse. For example, TurboTax is a well-established computer program that combines a user interface with computation to help humans calculate their taxes more efficiently.
Although such advances have clearly benefited society, it is also important to take a step back to scrutinize the demographics of the programmers behind the code. The technology industry suffers from an acute lack of diversity: only 10% of researchers in Artificial Intelligence (AI) at Facebook are women, and 4.5% of workers at Google are African-American. Currently, women hold only 25% of all tech jobs in Silicon Valley, and this number has been on the decline over the last few years. This issue also extends to the degrees awarded each year: according to a study conducted by the U.S. Equal Employment Opportunity Commission, men receive at least 70% of all degrees in computer science, mathematics, and engineering each year. In the 1980s, women earned 37% of all computer science degrees, but today, that number has dropped to 18%.
There are many reasons for such a stark disparity in gender and racial diversity in the technology sector. Part of the inequality stems from unequal education opportunities: from a young age, women and minority groups generally get less exposure to such fields and are therefore at an unfair disadvantage competing with those who regularly participate in and learn about STEM-related activities. Even those who do receive degrees in tech-related fields are often discouraged from pursuing a career in a field so heavily dominated by white men. In addition, women in the technology industry are just as victimized by the “80 cents to the dollar” inequity that permeates the American workplace.
This lack of ethnic and gender diversity in AI causes issues that compound the marginalization of minority groups. Because sexism in the workplace is as pervasive as the technology itself, a significant chunk of the population is being left out of the development and progress of this field. These demographic disparities have immense implications for Artificial Intelligence, a huge field of computer science that has been steadily growing in presence.
In particular, one of the tasks that can be handled by AI is asking a machine a subjective question. Whereas a tax calculation system can punch some numbers into some objective function and get an instantaneous answer, a machine trying to answer a subjective question must learn to simulate how a human being would answer a question. This requires AI, which involves writing code that can be trained through data processing to perform tasks that are not as easy as a simple calculation.
For example, there is no easy mathematical function that can allow a machine to perform the subjective job of an application reviewer. Instead of a straightforward operation, the goal of a programmer would be to create a machine to simulate the reviewer as closely as possible. By inputting data consisting of a human application reviewer’s decisions under different conditions, a machine can be taught to model a human’s actions in reviewing an application.
Amazon famously set about creating such a program in 2014. Programmers gathered data from past hires—given different components of a resume (university degrees, GPA, experiences, and so on), a human reviewer would pipe their hiring result into a machine learning program that created a complex mathematical model that outputted a score based on the applicant’s resume. In principle, this program would provide a score to an applicant based on the strength of an application so that human reviewers could automatically screen out applications below a certain threshold score and give special consideration to applications with scores above some threshold. However, what people found was that implicit biases in the human reviewer data that the programmers used in their model tended to discriminate against women—the most alarming of the indicators was that an occurrence of “women’s college” in the applicant’s resume would automatically result in a score reduction. In fact, anything involving “women,” like the phrase “women’s chess club,” would result in a score reduction. This is an especially concerning case of data that reflects harmful biases being encoded and magnified in AI applications.
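A toy sketch of how this can happen: when the hiring labels in the training data are biased against a phrase, even a trivially simple word-scoring model learns to penalize it. The data and scoring scheme below are invented for illustration and bear no relation to Amazon’s actual system.

```python
# Toy illustration (invented data; no relation to Amazon's real system):
# when hiring labels are biased against a phrase, even a trivial
# word-scoring model learns to penalize it.
from collections import defaultdict

# Hypothetical past decisions: 1 = hired, 0 = rejected. The labels are
# biased against resumes containing "women's".
training_data = [
    ("captain chess club", 1),
    ("captain women's chess club", 0),
    ("member chess club", 1),
    ("member women's chess club", 0),
]

# Per-word weight: average label of resumes containing the word,
# centered so that 0 means "no effect on the score".
word_totals = defaultdict(lambda: [0, 0])  # word -> [label sum, count]
for text, label in training_data:
    for word in set(text.split()):
        word_totals[word][0] += label
        word_totals[word][1] += 1

weights = {w: s / n - 0.5 for w, (s, n) in word_totals.items()}

def score(resume):
    return sum(weights.get(word, 0.0) for word in resume.split())

print(weights["women's"])                   # -0.5: the bias, learned as a penalty
print(score("captain chess club"))          # 0.0
print(score("captain women's chess club"))  # -0.5: lower purely for the added word
```

Nothing in the code mentions gender; the penalty emerges entirely from the biased labels, which is the crux of the problem described above.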
In addition to this application-reviewing program, another example of this problem in AI is facial recognition software. Studies have found that the same facial recognition program works better for white males than for dark-skinned individuals. The difference in accuracy between fair-skinned and dark-skinned individuals does not necessarily come from bad intentions; it is an unfortunate byproduct of a skewed set of programmers, and therefore a skewed set of training data and code.
People are recognizing the magnitude of the issue and many are fighting to remedy this problem. An example is Stanford AI Professor Fei-Fei Li’s efforts to encourage young women to pursue AI-related careers—in 2015, she founded a summer program called “AI4ALL” which draws girls from all over the world to study computer science at Stanford University. In addition, outreach programs such as STEM Starters at Columbia University are connecting with schools in the inner-city to spark interest among students in STEM, helping to equip the next generation to overcome the imbalance in the tech industry.
This is AI’s diversity crisis. The computer programs that help us with our everyday lives have real programmers behind them, and this community suffers from an acute lack of diversity. Extremely sensitive software, such as surveillance and facial analysis tools, is currently created by programmers who are predominantly white and male. Without a diversification of perspectives behind computer code, the same biases and inequities that affect society today will be magnified and perpetuated by computer programs. As AI continues to advance and become a larger presence in our lives, this lack of diversity will have dire consequences for society.
By Victoria Comunale
Sleep and memory formation share a fickle relationship. We are told to get plenty of rest before a test because sleep plays a crucial role in learning by aiding memory consolidation. Yet, sometimes, after a night of rest, we can completely forget about something that was on our minds the night before. Sleep’s role in memory consolidation and memory loss has been explored by neuroscientists before, but the balance between these two dueling occurrences, as well as the role of brain waves in this phenomenon, still remains somewhat of a mystery.
Researchers at the University of California San Francisco (UCSF) recently published a study examining the role of different kinds of brain waves in memory consolidation and forgetfulness in rats. In this experiment, the researchers trained rats to control a feeding tube. This control was a skill that the rats learned gradually, and memory consolidation played a crucial role in the process. Successfully completing the task required the rat to move the tube from point A to point B within 15 seconds. Since this learning process involved the motor cortex, the researchers studied the brain waves in this area of the brain during non-REM sleep.
There are several stages of non-REM sleep that can be differentiated by distinct patterns of brain wave activity. These stages are arranged from lightest to deepest sleep, and the waves themselves arise from the neural activity of a region of the brain known as the thalamus. The first stage of non-REM sleep, the lightest stage, is characterized by both alpha and theta waves. In stage two, theta waves dominate brain activity but are interrupted by brief bursts of higher frequency brain waves known as sleep spindles. The third and fourth stages of non-REM sleep, the deepest stages, also feature sleep spindles, but are predominantly characterized by delta waves and slow oscillation waves, which researchers were most interested in. In order to study the effects of these waves, they employed a recent technique that has grown in popularity in the field of neuroscience: optogenetics. Utilizing this technique, the researchers were able to directly interfere with the activity of neurons in the brain and interrupt the activity of the targeted brain waves, thus establishing a causal rather than correlational relationship.
Slow oscillation waves have long been suspected of playing a role in memory consolidation, yet the function of delta waves, which are more prevalent than slow oscillation waves, is still unknown. When the researchers used optogenetics to interfere with the slow oscillation waves, the success rate of the rats in the tube-moving task was much lower than that of the control group. Yet interfering with delta waves had the opposite effect: the rats were able to complete the task with a higher success rate than the control group. From these remarkable results, the researchers concluded that these dueling brain waves have opposite effects during sleep. This conclusion was unexpected, since both brain waves are found in the same stage. Therefore, memories are both strengthened and weakened during non-REM sleep. The difference in memory consolidation between delta waves and slow oscillations is stark and undeniable, contradicting prior speculations that they may have similar functions.
The researchers also focused on sleep spindles, the specific bursts of activity associated with slow oscillation waves. These spindles are already known to play an essential role in sensory processing and long-term memory consolidation. Because delta waves, which are associated with memory loss, are more prevalent during non-REM sleep, the researchers postulated that these spindles, coupled with the effects of slow oscillation waves, help balance memory consolidation against the memory loss associated with delta waves.
These results can also have implications for phenomena observed in humans, especially aging. It has been experimentally demonstrated that slow-wave activity has lower amplitude in the elderly. If the results from the UCSF group can be translated to the human brain, this means that the elderly are less likely to have consolidation benefits during sleep because of a reduction in slow oscillations and spindles. It is still unclear if this newfound information concerning the dueling brain waves can be applied as a useful tool in strengthening human memories, considering that the weakening of less important memories is necessary for memories of higher importance to be strengthened. Nevertheless, these results bring us one step closer to understanding the complicated activities of our brains.
By Dapo Lapite
Illustration by Lizka Vaintrob
The Roman Empire, the Mayan Empire, and the Chinese empires: every single one of these civilizations succumbed to the spread of pathogens, and in today’s world, there is a chance of this disaster repeating. The changing climate and highly advanced modes of international transportation have led to a spread of mosquitoes, ticks, and other organisms carrying dangerous pathogens. Due to the potential spread of an array of diseases, the United States must fund research aimed at finding vaccines, cures, and other medical advances in order to prevent a medical crisis.
It is imperative to examine the history of plagues in order to understand how diseases have impacted civilizations in the past. On average, somewhere in the world, a new infectious disease has “emerged every year for the past 30 years.” The recent Ebola crisis showed what could recur. Ebola became a heavily discussed topic in 1976 when a new illness emerged in Yambuku, in the Democratic Republic of the Congo. At this time, Jean-Jacques Muyembe was the only virologist in the Congo. Muyembe shipped blood samples to the Centers for Disease Control and Prevention in Atlanta, where the scientists then identified the virus. Ebola is a disease that can kill “not just the very young, old, and sick,” but also the strong and fit, by triggering a violent immune response. In 2014, Ebola truly caused mass chaos as hospitals ran out of beds and cities ran out of coffins.
The best example of the United States’ lack of preparation is the Asian longhorned tick. The Asian longhorned tick is the first invasive tick to spread to the United States in around 80 years. It is native to China, Japan, Russia, and the Korean Peninsula and has also found its way to Australia and New Zealand. In Asia, the tick carries a virus that causes human hemorrhagic fever, which kills around 30 percent of its victims. In 2013, South Korea reported 36 cases and 17 fatalities.
This particular virus is not found in the United States, but it is extremely similar to the Heartland Virus, another life-threatening tick-borne disease cycling through the United States. Diseases spread by ticks are typically underreported. As a result, there are no proven measures that can be used to control several vector-borne diseases transferred by the black-legged tick, which spreads at least seven human pathogens in the United States, including bacteria that cause Lyme disease.
Climate change is increasingly becoming a problem because it is a factor in the emergence of infectious diseases. Warming temperatures make the environment in the United States more hospitable for ticks and other vectors. And according to the Baylor College of Medicine, as climates warm and habitats are altered, diseases can spread into unforeseen geographic areas. Both ticks and mosquitoes are prime examples of species that have expanded their range into regions where they had not previously been seen. Illness from mosquito, tick, and flea bites “more than tripled” in the United States from 2004 to 2016.
Other parts of the world also fall victim to these diseases through the high amounts of travel taking place. For example, the chikungunya disease, an insect-borne disease, was previously confined to tropical regions around the Indian Ocean. Yet there have been several cases of chikungunya disease imported into the United States by international travelers, including one case in Louisiana. Similarly, Severe Acute Respiratory Syndrome (SARS) first appeared in China in 2002 and quickly spread to other countries near China, and it made it as far as Canada because of air travel. Ultimately, SARS infected 8,000 people and killed 800 before an unprecedented global response halted the disease. The major underlying causes of the increase in vector-borne diseases are growing travel, trade, urbanization, population growth, and increasing temperature. The United States is prone to be shortsighted and forgetful when it comes to the influx of diseases, and this trend has only continued in recent years.
Currently, if a massive spread of diseases hit the United States, mass panic would ensue due to the lack of preparation and funding. The development of vaccines and antimicrobial drugs over the last decade created hope that infectious diseases could be controlled, but infectious diseases continue to emerge and re-emerge, posing ongoing challenges for research. At first, it seemed that the United States would act proactively when it committed one billion dollars to the effort in 2014. But that now looks uncertain; President Trump’s budget for 2019 cut 67 percent from the current annual funding. With less funding, the CDC will be forced to withdraw from several countries, resulting in a loss of jobs and of vital medical knowledge in these regions. With all the evidence pointing towards another plague, it is imperative that the United States begin to reinvest in the fight against potential diseases.
By Elaine Zhu
Selfishness is the act of not caring about others, only thinking about getting ahead, and profiting at the expense of others. While most people think of selfishness as an acquired human trait, research has shown that some genes can also act in a selfish or even parasitic manner in order to increase their own chances of being transmitted to offspring. These parasitic genes don’t benefit the body’s overall fitness but instead methodically increase their own chances of transmission. Recent studies have investigated the mechanisms used by these selfish and parasitic genes and their potential applications.
In a study conducted by Nicole Nuckolls, María Angélica Bravo Núñez, and Sarah Zanders, a gene called wtf4 in Schizosaccharomyces kambucha fission yeast was identified as one of these selfish genes. The researchers discovered that the wtf4 gene actually acts simultaneously as a poison and an antidote. During meiosis, the wtf4 gene produces a precisely timed molecular poison that spreads to all developing gametes, both those that inherited the wtf4 gene and those that did not. Before the walls of the spores have formed, the molecular poison reaches every offspring of the cell.
However, the cells with the wtf4 gene also carry the antidote to the poison they created. The antidote is made in the later stages of cell development and division, after the spore walls have completely formed. Therefore, the gametes that inherited the gene are protected from the poison, while the gametes that did not inherit wtf4 are left with no protection and eventually die off, thereby selecting for the wtf4 gene. By coloring the proteins, Dr. Zanders and her colleagues identified the two specific RNA messenger molecules that the wtf4 gene uses to encode the poison and the antidote. After imaging the cells while they underwent meiosis, the scientists were able to clearly confirm that the wtf4 poison spread to every cell, but the antidote was present only in the spores with the wtf4 gene.
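The poison-antidote timing can be restated as a tiny piece of logic: the poison reaches every spore before the walls form, while the antidote appears afterward only in wtf4 carriers, so only carriers survive. The sketch below is purely illustrative logic, not a biological simulation.

```python
# Purely illustrative logic sketch (not a biological simulation): the
# poison reaches every spore before the walls form; the antidote appears
# afterward, only in spores carrying wtf4, so only carriers survive.

def surviving_spores(spores):
    """spores: list of booleans, True if the spore inherited wtf4."""
    survivors = []
    for has_wtf4 in spores:
        poisoned = True      # pre-wall: poison spreads to every spore
        antidote = has_wtf4  # post-wall: antidote made by carriers only
        if not poisoned or antidote:
            survivors.append(has_wtf4)
    return survivors

# A cross expected to yield half wtf4 carriers ends up all-carrier:
print(surviving_spores([True, False, True, False]))  # [True, True]
```

The point of the sketch is the asymmetry: a cross that Mendelian inheritance would make half-carrier ends up all-carrier, which is exactly how the gene biases its own transmission.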
Further research is currently being conducted to identify more selfish genes with the hope that the mechanisms for these genes can be applied in other scientific disciplines. These selfish genes can also provide insight into human infertility, since the “cheating” methods that such genes use can bias natural selection and even directly cause infertility. That is, a selfish gene could produce spores with an incorrect number of chromosomes, which can actually be detrimental to the survival of the daughter cells. Research has shown that chromosomal abnormalities are one of the leading explanations for why miscarriages happen in humans. In an interview conducted by the National Institute of General Medical Sciences, Dr. Zanders stated that “learning general principles about selfish genes in simple models will guide future searches for selfish genes that could be contributing to human infertility.”
Another exciting potential application of these selfish genes is in gene drives. Gene drives are a type of genetic engineering technology that can spread a desired set of genes through a population by increasing their probability of being inherited. These selfish gene mechanisms can potentially lead to the creation of a type of gene drive that may curb or even eradicate problematic insect populations, such as malaria- or dengue-transmitting mosquitoes. These selfish genes use a variety of methods to overpower other genes, and scientists may one day be able to utilize these mechanisms to ultimately improve the quality of human life.