Gregg Barrett emphasizes the crucial role of AI in South Africa’s economic future and the urgent need for a comprehensive national strategy to address digital inequality and infrastructure challenges.


Featured Article by Gregg Barrett – “The Hard Power Economics of AI for South Africa”. The article first appeared in Business Day.



OpenAI recently introduced the latest in its GPT series of large language models (LLMs) to widespread interest. GPT stands for Generative Pre-Trained Transformer. Generative describes a class of statistical models that can generate new data instances, from a single pixel to a full image, or a single word to a complete paragraph. Pre-trained refers to the model parameters (the model’s weights and biases) having already been learned through extensive training runs on broad data, before any task-specific fine-tuning. Transformer refers to the model architecture first described in a 2017 paper from Google. Transformers use an attention mechanism to draw global dependencies between input and output, such as tracking relationships in sequential data like the words in a sentence. Nowadays people use transformers every time they search on Google or Bing.
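
To make the attention mechanism concrete, below is a minimal sketch of scaled dot-product attention – the core operation of the transformer architecture – in Python with NumPy. The dimensions and random inputs are toy values for illustration; real models use learned projections and many attention heads in parallel.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation: each output position is a weighted
    average of the values V, with weights set by how strongly each
    query in Q matches each key in K."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # pairwise query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over each row
    return weights @ V                             # blend values by attention weight

# Toy example: a 4-token "sentence", each token embedded in 8 dimensions.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
# Real models derive Q, K, V from learned projections of the tokens;
# here the embeddings are reused directly for simplicity.
out = scaled_dot_product_attention(tokens, tokens, tokens)
print(out.shape)  # (4, 8): one context-aware vector per token
```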

In 2021 Stanford researchers coined the term ‘foundation models’ for models trained on broad data at scale that can be fine-tuned for a wide range of downstream tasks, with large transformers the prime example. OpenAI’s GPT-3, released in June 2020, had 175 billion parameters, was trained on 570GB of filtered text data, and used 10,000 GPUs (graphics processing units – chips used for AI) for its training. In October 2021, Microsoft and NVIDIA announced the Megatron-Turing Natural Language Generation model with 530 billion parameters, at the time the world’s largest and most powerful generative language model, trained on a combination of 15 datasets in a process that required thousands of GPUs running for weeks.
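
The fine-tuning pattern behind foundation models can be sketched in a few lines of PyTorch: a large pre-trained backbone is frozen and a small task-specific head is trained on downstream data. The backbone below is a random stand-in rather than a real foundation model, and the dataset is synthetic, purely to illustrate the division of labour.

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained foundation model backbone (in practice,
# billions of parameters trained on broad data at scale).
backbone = nn.Sequential(nn.Embedding(1000, 64), nn.Flatten(), nn.Linear(64 * 16, 128))
for p in backbone.parameters():
    p.requires_grad = False  # freeze the pre-trained weights

head = nn.Linear(128, 2)  # small task-specific head, e.g. two sentiment classes
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Synthetic downstream dataset: 32 "sentences" of 16 token ids, binary labels.
x = torch.randint(0, 1000, (32, 16))
y = torch.randint(0, 2, (32,))

for step in range(100):  # fine-tune only the head on the downstream task
    logits = head(backbone(x))
    loss = loss_fn(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```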

While the GPT series of LLMs is what most people may be familiar with, AI has had a significant impact on science, where it has gone from modelling language to modelling the relationships between atoms in real-world molecules. Built on large datasets with large-scale accelerated computing, transformers make accurate predictions and generate new data instances that drive their wider use – a virtuous cycle generating more data that can be used to create ever better models. Scientific fields from drug design to materials discovery are being catalysed by these massive datasets and powerful models. For example, NVIDIA and the University of Florida’s academic health center collaborated to create a transformer model named GatorTron to extract insights from large volumes of clinical data and accelerate medical research. Generative models are being used to produce more robust synthetic controls for clinical trials, drawing on data from different modalities in cases where patient recruitment or retention is difficult.

London-based DeepMind developed a transformer called AlphaFold2, which processes amino acid chains like text strings to accurately predict protein structure. The curation of open-source protein and metagenomics data sets was pivotal in enabling the training of AlphaFold, which in turn enabled new data sets of predicted structures for almost all known proteins. The work is now being used to advance drug discovery, with DeepMind establishing a sister company, Isomorphic Labs, to pursue it. Using AlphaFold together with an AI-powered drug discovery platform, researchers led by the University of Toronto Acceleration Consortium were able to design and synthesize a potential drug to treat hepatocellular carcinoma (HCC), the most common type of primary liver cancer. Traditionally drug discovery has relied on trial-and-error methods of chemistry that, in comparison to AI-driven methods, are slow, expensive and limit the scope of exploration of new medicines. In the case of the potential HCC drug, the process took just 30 days from target selection and required the synthesis of only seven compounds. These AI-powered drug discovery platforms work in concert with self-driving laboratories, an emerging technology that combines AI, lab automation, and advanced computing.
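
The idea of processing amino acid chains like text strings is easy to illustrate: a protein is a sequence over a 20-letter alphabet, which can be tokenised much as words in a sentence are before being fed to a transformer. A minimal sketch follows; the sequence is illustrative, and AlphaFold2’s actual input pipeline is far richer (including, for example, multiple sequence alignments).

```python
# The 20 standard amino acids, one letter each, form the "alphabet".
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
TOKEN_OF = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def tokenize(protein: str) -> list[int]:
    """Map an amino acid chain to integer token ids, just as a language
    model maps words to tokens; a transformer can then learn the
    relationships between positions in the chain."""
    return [TOKEN_OF[aa] for aa in protein]

# An illustrative (not real) protein fragment.
print(tokenize("MKTAYIAK"))  # [10, 8, 16, 0, 19, 7, 0, 8]
```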

In South Africa, health is an obvious example of a significant economic and societal opportunity, where the technology needs to serve as the centrepiece of a new healthcare model to drive down costs, increase access, and radically improve outcomes. The application of AI in this context will provide predictive, preventative, personalised care and will help to reduce demand. Data is critical to unlocking the benefits of AI, with large, diverse and multi-modal data (e.g., radiology, digital pathology) required to transition from the research setting into everyday clinical practice. However, patient and other important data are stored in silos across different hospitals, universities, companies, and research centres. This is not an area in which the government can sit on the sidelines. A major reset in thinking is required.



To date, publicly available data has been responsible for many advances in AI. For example, Common Crawl enabled progress in language models, and ImageNet drove progress in computer vision. These data sets were relatively cheap to produce – in the region of hundreds of thousands of dollars – but have generated spillover value in the billions of dollars. AI has worked particularly well in pathology (cancer/MRI images), and some experts consider it better than trained doctors. In the UK, UK Biobank is helping to produce the world’s largest organ-imaging data set, based on 60,000 participants, to assess disease progression, and in Europe the European Health Data Space (EHDS) has been established to help create standards, improve interoperability, and allow access to data for research.

For South Africa there is now a once-in-a-generation opportunity to revolutionise health with the implementation of a single national health infrastructure that brings data together into a world-leading system. This would facilitate a single centralised electronic health records (EHR) system. It would ensure a common standard of EHRs across the health system, with common data collection, data standards and interoperability, allowing the benefits of a connected data system to be fully realised. A single database would also be able to connect more easily with external systems to share data in trusted research environments, platforms providing data from devices such as wearables, and clinical-trials-management systems.
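
To illustrate what a common record standard buys, here is a hedged sketch in which every hospital exports the same minimal JSON structure, loosely modelled on the HL7 FHIR Patient resource widely used for EHR interoperability. The field names, identifiers, and data are illustrative only, not a proposed national schema.

```python
import json

def to_standard_record(patient_id: str, given: str, family: str,
                       birth_date: str, source_system: str) -> dict:
    """Normalise a patient record into one shared schema, loosely
    modelled on a FHIR Patient resource. Whatever local format a
    hospital uses internally, it exports this common structure."""
    return {
        "resourceType": "Patient",
        "id": patient_id,
        "name": [{"given": [given], "family": family}],
        "birthDate": birth_date,           # ISO 8601: one agreed date format
        "meta": {"source": source_system}  # provenance for audit and trust
    }

# Records from two different silos, now directly comparable and mergeable.
records = [
    to_standard_record("za-0001", "Thandi", "Nkosi", "1984-06-12", "hospital-a-ehr"),
    to_standard_record("za-0002", "Sipho", "Dlamini", "1979-02-03", "clinic-b-ehr"),
]
print(json.dumps(records, indent=2))
```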

Operationally, the data management platform behind the EHR would cost less than 2 billion rand and be operational within a year, with rollout across the entire healthcare system taking place inside three years. The spillover would be significant. In 2019, Ernst & Young estimated that unlocking NHS health data could be worth up to £10 billion per annum through operational efficiencies, improved patient outcomes and wider economic benefits. In addition, collaborating with life sciences firms to access national health data, where appropriate, would generate funding for greater investment in public research and development.

AI offers the opportunity to catalyse a radically different future for South Africa, a national purpose that is bold and optimistic, that embraces technology to restore science and research, help citizens live longer, healthier lives and create new industries with meaningful employment. However, achieving this economic transformation requires a generational change in how work and innovation take place. Capturing the opportunity in AI is a marathon, not a sprint. The winners will be those who can effectively frame problems as AI problems and combine engineering with the requisite hardware, software, and data architecture to drive innovation.

The magnitude of the undertaking to transform South Africa from a consumer of AI to a producer cannot be overstated. OpenAI’s GPT-4 and its successors are being trained on tens of thousands of the highest-specification GPUs for months on end. There are fewer than 500 such top-specification GPUs on the African continent, meaning that to train a single model, a private lab in California is using at least 50 times the total compute capacity available on the entire continent. To help countries plan for AI compute capacity, the OECD has published the first blueprint on AI compute. Canada and the United Kingdom have also begun needs assessments for compute infrastructure more broadly, but planning for specialised AI compute needs across the economy remains a major policy gap, and South Africa has not undertaken any such assessment.
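
A back-of-envelope check of that ratio, using the article’s own figures and reading “tens of thousands” as 25,000:

```python
# Article's figures: a single frontier-scale training run vs the whole continent.
gpus_per_training_run = 25_000   # "tens of thousands" of top-spec GPUs (assumed midpoint)
gpus_on_continent = 500          # fewer than 500 such GPUs in Africa

ratio = gpus_per_training_run / gpus_on_continent
print(f"One training run uses ~{ratio:.0f}x Africa's top-spec GPU capacity")
# -> One training run uses ~50x Africa's top-spec GPU capacity
```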

A recent report by the UK Government Office for Science noted that many smaller research centres and businesses in the UK were having difficulty gaining access to large-scale computing platforms, which curtailed the scope of their AI development. Likewise, a 2020 study found that an increased need for specialised computational infrastructure and engineering can result in ‘haves and have-nots’ in a scientific field. “We contend that the rise of deep learning increases the importance of compute and data drastically, which, in turn, heightens the barriers of entry by increasing the costs of knowledge production,” the paper reads.

According to the November 2022 Top500 list, there are only 34 countries in the world with a “top supercomputer”. While the list does not capture all the top systems in the world, it does serve as a proxy for the countries with capability and those without. It is primarily the top 50 systems on the list that have real capability from an AI and science standpoint and are predominantly used for such work. Apart from the leading countries (the US, China, countries from the EU27, the UK, and Japan), the rest of the world makes up 12% of the supercomputers on the list, with countries from the Global South sparsely represented and South Africa completely absent.

Image credit: Oak Ridge National Laboratory

Frontier, the exascale supercomputer that tops that list, was leveraged by researchers at the US Department of Energy’s Oak Ridge National Laboratory to perform a large-scale scan of biomedical literature to find potential links among symptoms, diseases, conditions and treatments, illuminating connections between different conditions and potentially leading to new treatments. The system was able to search more than 7 million data points from 18 million medical publications in only 11.7 minutes. The study identified four sets of paths for further investigation through clinical trials.

Consider Anton 3, a supercomputer specially designed for atomic-level simulation of molecules relevant to biology (e.g., DNA, proteins, and drug molecules) and used for drug discovery. The system, developed on the sideline of D. E. Shaw’s primary hedge fund business, is not listed in the Top500 but would easily rank in the top 50, and sits far beyond anything on the African continent. Put differently, without such systems and engineering in South Africa, how are researchers and institutions expected to undertake modern drug discovery? They cannot, and this extends to all fields now driven by AI, from climate to materials.

As Jack Clark, co-founder of Anthropic, recently pointed out, beyond the technological aspects GPT-4 should be viewed as the rendering of hard power economics in computational form: a capable data transformation engine and knowledge worker whose engineering and parameterisation are controlled and owned by a single private company. It is indicative of how contemporary AI research is very expensive, and of how these operations should be thought of more as capital-intensive, factory-style businesses than SaaS (Software-as-a-Service) companies. For example, Adept, an AI startup, is training large-scale generative models to take actions on computers. Using plain English commands, one can imagine Adept taking data from one application and loading it into another, or carrying out multi-step actions in a spreadsheet. Adept raised $350 million in Series B funding earlier this year.

AI is unquestionably going to have a bearing on economic life and cause societal changes – an obvious example being the irrevocable changes in how education works that OpenAI’s ChatGPT has already set in motion. In South Africa a paradigm shift is required: tourism and natural resources are not an economic future for the country; rather, a collective belief is needed that technology is crucial to building wealth. South Africa reaping the economic rewards from AI will not magically fall into place, nor will the country simply evolve into a technology producer over time. Rather, OpenAI’s GPT-4 is an alarm bell that South Africa is falling so far behind that it risks never being able to catch up. Without retooling to capture the economic gains, AI will accentuate the ‘haves and have-nots’. According to the World Bank, at last count the number of poor people in Sub-Saharan Africa had risen, and South Africa’s real GDP growth is projected to decelerate sharply to 0.1 percent in 2023, based on the latest estimates by the IMF.

While there are many challenges to confront, we need not be pessimistic about our future. Ahead of us is the opportunity to set a future for South Africa – it’s an opportunity we must seize. The South African people need a new national purpose – one that is bold, optimistic, and embraces technology. By identifying technological opportunities presented by AI, such as those in healthcare with EHRs, and expanding access in an appropriate and safe way, South Africa could establish a competitive edge over other health systems, providing invaluable data sets that could drive the needed progress in life sciences to deliver novel diagnostics and treatments. Ultimately it is that which sits beyond AI – the principles of ambition, invention and compassion that characterise our collective spirit – that we now need to summon to drive lasting impact and a future for the country.


About the Author:

Gregg Barrett is a seasoned executive with extensive and diverse experience in strategy, building and managing relationships, deal-making, communication, developing high-performance teams, organisational leadership, and problem-solving across a range of areas. Over the last decade, Gregg has led work in data science, machine learning, corporate research, and corporate venture capital, including establishing and managing operations in those areas, working across people, processes, and technology, and integrating structured and unstructured data to direct research, business, and investment strategy.