It is an inescapable fact that the world runs on data. Essentially, we are data. We would not exist without the minuscule but complex units of data that form our very core: the two inherited sets of code, each consisting of roughly 3 billion letters (or base pairs), i.e., our DNA. A code of 3 billion components is incredibly delicate, which makes manipulating DNA extraordinarily difficult. Even the smallest change, such as a point mutation, can cause cells to behave in dangerous ways, giving rise to genetic diseases such as sickle-cell disease (SCD) and progeria. Fortunately, the advancement of technology has given us a new method to make these minute but momentous changes with accuracy – gene editing.
Gene editing allows scientists to alter the DNA of an organism – and with it, the organism's genetic makeup. Because the code is so intricate, this engineering has traditionally been extremely slow and expensive. Developments such as the CRISPR-Cas9 tool – which can identify and cut DNA at targeted locations – have made the process swifter, cheaper, and more precise.
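To make "identify and cut DNA at targeted locations" concrete, the sketch below shows, in miniature, what a Cas9 guide-design tool does: scan a DNA sequence for a 20-letter guide match immediately followed by an "NGG" PAM motif, the signal SpCas9 needs before it cuts about 3 base pairs upstream of the PAM. The sequence and guide here are made-up illustrations, and real design tools also score off-target matches, which this sketch ignores.

```python
# Minimal sketch of Cas9 target-site search (illustrative only).
# SpCas9 binds where a 20-nt protospacer matching the guide RNA is
# immediately followed by an NGG PAM, then makes a blunt cut ~3 bp
# upstream of the PAM.

def find_cas9_sites(dna: str, guide: str) -> list[int]:
    """Return, for each guide match, the index of the first base
    on the PAM side of the blunt cut (3 bp upstream of the PAM)."""
    dna, guide = dna.upper(), guide.upper()
    sites = []
    # Stop early enough that a full guide + 3-nt PAM still fits.
    for i in range(len(dna) - len(guide) - 2):
        pam = dna[i + len(guide): i + len(guide) + 3]
        if dna[i:i + len(guide)] == guide and pam[1:] == "GG":
            sites.append(i + len(guide) - 3)  # cut 3 bp before the PAM
    return sites

sequence = "TTACGGATTCTTAGGACCATGAACAGGTCA"   # hypothetical locus
guide = "GGATTCTTAGGACCATGAAC"                # hypothetical 20-nt guide
print(find_cas9_sites(sequence, guide))      # → [21]
```

Here the guide matches at position 4 and is followed by the PAM "AGG", so the tool reports a cut between positions 20 and 21.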
The primary aim of gene editing is to treat and prevent simple single-gene diseases like hemophilia and SCD, as well as more complicated conditions like AIDS and cancer; naturally, further developments will expand the areas of application for this group of technologies.
While the concept of gene editing has been around since the 1950s, significant progress enabling practical application was made only quite recently. Even then, the technique was mostly used on organisms such as fungi, mice, and rabbits, as the ethical implications of experimenting on humans remain a matter of ongoing debate. This moral restriction stems from concern about side effects: unintended genomic damage can have irreparable pathogenic consequences such as cancer.1 This is where artificial intelligence (AI) and machine learning (ML) come in.
Data-driven modeling of biological systems presents opportunities for revolutionary advances in biotechnology. A diverse range of AI strategies can aid the advancement of synthetic biology and the achievement of the aforementioned objectives. For instance, researchers at the Wellcome Sanger Institute announced the use of ML to predict which mutations a cell will introduce once CRISPR cuts the DNA strand.2 Data were collected by analyzing over a billion different gene-cutting scenarios with CRISPR and fed to an ML tool named FORECasT. This resource can predict the mutations arising from gene editing, cutting the cost and time needed for the process. Scientists at MIT have also developed an ML model that predicts insertions and deletions in human disease cell lines.1 ML applications, in tandem with Cas13,3 can help screen for multiple pathogens at once, including the 169 known human-associated viruses,4 thus improving point-of-care diagnostics. In fact, scientists at the University of California, Berkeley and the Gladstone Institute of Virology in San Francisco demonstrated rapid COVID-19 detection using an amplification-free CRISPR-Cas13 test and a mobile phone camera last year.
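Models such as FORECasT are trained on enormous experimental datasets; the snippet below is emphatically not that model, just a stdlib-only sketch of one sequence signal such predictors exploit: short repeated motifs ("microhomologies") flanking the cut site, which bias the cell's repair machinery toward a characteristic deletion. The sequence and cut position are hypothetical.

```python
# Illustrative sketch: find the longest microhomology spanning a cut
# site, a crude proxy for the deletion that microhomology-mediated
# repair is most likely to produce. Real outcome predictors (e.g.
# FORECasT) learn full outcome distributions from measured data.

def predict_mmej_deletion(seq: str, cut: int, max_mh: int = 6):
    """Return (microhomology, deletion_length) for the longest short
    repeat that ends at the cut and recurs downstream of it."""
    best_mh, best_del = "", 0
    for k in range(1, max_mh + 1):
        mh = seq[cut - k: cut]          # k bases ending at the cut
        j = seq.find(mh, cut)           # same k-mer downstream of the cut
        if len(mh) == k and j != -1:
            # Repair collapses the two copies into one, deleting the
            # intervening bases plus one copy of the repeat.
            best_mh, best_del = mh, j - cut + k
    return best_mh, best_del

site = "AAGCTGCATTGCAGGG"   # hypothetical sequence, cut between index 7 and 8
print(predict_mmej_deletion(site, 8))   # → ('TGCA', 5)
```

In this toy example the 4-letter repeat "TGCA" brackets the cut, so the sketch predicts a 5-base deletion – the kind of pattern that, at scale and with learned weights, lets ML tools forecast editing outcomes before an experiment is run.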
From combating HIV infection5 to eliminating antibiotic-resistant bacteria,6 the impact of this relatively young field has been tremendous. Genomic editing is revolutionizing medicine by allowing doctors to transition from a one-size-fits-all method of treatment to targeted cures7 based on an individual's own genetic make-up. Modern data connectivity and AI algorithms have the unique capability to provide a real-time and dynamic picture of bio-based events. In addition to pathology research and disease eradication, gene editing can also select certain genes to be eliminated or passed along – rewriting the rules of inheritance. For instance, the population of malaria-spreading Anopheles mosquitoes can be suppressed using genomic editing.8 The beneficial applications of gene editing go beyond the sectors of medicine. As an example, the use of CRISPR tech in agriculture to improve crop yields and resilience9 has real-world applications, with the Chinese government approving the use of gene-edited crops this year.10 In the US, a biotech startup has promised the release of gene-edited crops this year to make more resilient produce, which would reduce food wastage significantly.11 Beyond biotechnology, CRISPR is set to revolutionize blockchain technologies as well. A combination of CRISPR simulation algorithms and AI will help users grow their non-fungible tokens (NFTs) – which are the building blocks of the metaverse – through interactions with their community.12 The emergence of these far-reaching applications, along with growing acceptance of the field, the boom in data availability, and increasing adoption of AI and ML technologies, has triggered an expansion in the global AI in genomics market – a CAGR of 48.44% is expected in the 2021–2027 period.13
Gene editing and CRISPR have largely been used as research tools thus far, but commercial applications are on the horizon. In the pharmaceutical industry, while CRISPR gene therapy has shown great promise against genetic disorders like β-thalassemia and SCD,14 mainstream applications are restricted to individuals with rare diseases who are not responding to any other form of treatment. Outside of medicine, the latest advances in genomics have seen CRISPR technology used to create genetically modified plants for the market, without the decades of trial and error required by traditional genetic engineering. Successful products from this application include CRISPR cabbage15 and the GABA tomato,16 the latter of which has been released in the Japanese market. This revolutionary technology is even helping identify traits useful for climate change adaptation in crops17 and biofuel production.18
Thus, the potential applications of gene editing, in combination with AI, are far-reaching, affecting conservation, agriculture, and personalized drug development and disease treatment. Further investment in digitization and gene research is crucial to continue along this path of "smart" health. Machine learning reduces the chance of error, improving the accuracy of treatment and diagnostic predictions. Understanding and processing this data provides more avenues for AI to make a big impact, bringing us closer to a technology-powered future.