Commentary: To Win the AI Race, Invest in Talent Development
Op-ed piece for The Richmond Times-Dispatch by CCI Executive Director Luiz DaSilva, March 8, 2024
The key to winning the AI race will not be the data centers or power grids we build; it will be the talent we develop. In a field moving at breakneck speed, it is especially important to train and recruit the professionals who will drive innovation in building and securing AI systems.

The race for dominance in AI is running full tilt, with a strong focus on infrastructure resources. First came news of Stargate, a $500 billion joint venture by OpenAI, Oracle and SoftBank, a development momentous enough to merit President Donald Trump’s presence at its announcement.
The case for this massive level of investment is to build the infrastructure required for major initiatives in AI. Only a couple of days later, the Chinese startup DeepSeek announced AI advances comparable to current state-of-the-art solutions while using significantly more modest resources. Nvidia, the California-based manufacturer of AI chips, saw its valuation drop more than $600 billion on the possibility that the need for investment in computing power may have been overestimated.
If the gains announced by DeepSeek are confirmed, it will be an example of how an unforeseen AI breakthrough can, in an instant, challenge the conventional wisdom about where the AI market is going. For the U.S. to lead the world in AI, the critical investment needs to be in talent that can produce such disruptions.
In my own corner of the vast AI landscape, as executive director of the Commonwealth Cyber Initiative, our focus is innovation in cybersecurity. There, AI cuts two ways: it can be harnessed to improve the security and privacy of our solutions, or it can open up new vulnerabilities in our systems. We must train talent that can develop the former and safeguard against the latter.
AI for cybersecurity is already firmly established. AI is routinely used to detect malicious activity: credit card companies use it to flag unusual purchases, and system administrators use it to identify known attacks and combat “zero-day” exploits on computer systems.
The flip side is that malicious actors also develop attacks that target AI systems. For example, data poisoning attacks manipulate the data used to train an AI system so that it produces wrong outputs, or create a backdoor that can be exploited for unauthorized access to information. AI can also power ransomware and social engineering attacks, through phishing that is increasingly sophisticated and targeted. And jailbreak attacks aim to bypass the security measures built around large language models, or LLMs.
In our network of Virginia universities, researchers are using generative AI to increase the security of iris biometrics, leveraging LLMs to analyze security vulnerabilities in software, and employing privacy-preserving learning for public health surveillance. We also run an incubator program that helps inventors transition the results of this research into commercialization through startups or enterprises. To meet our growing AI cybersecurity challenges, we must follow the blueprint we have laid out in Virginia and scale that investment throughout the entire nation.
The infrastructure that enables AI is, of course, important. But even more important, our country must educate a population that is well-versed in the development and use of AI: its possibilities and limitations, its benefits and risks.
This needs to start at an early age, recognizing that today’s children will grow up in a world pervaded by AI. We must encourage and fund research on all aspects of AI, and pay particular attention to issues such as explainability, trust, and security. And we need to foster an entrepreneurship ecosystem that takes the best ideas developed by individuals and research teams and transitions them into products and services. Investing in people is the key to durable leadership in this new AI race.