Funding Round Profile

HPC-AI Tech

Startup
  • 01/10/2024
  • Series A
  • $50,000,000

HPC-AI Tech is a startup focused on improving the efficiency of enterprise AI model training and deployment. Before launching the company, its founder had broken two world records for AI training speed. Most recently, in 2019, he and his team set the record for training speed on BERT, a benchmark used by Google and Microsoft, among others, to measure a machine's understanding of human language.

HPC-AI Tech is the company behind Colossal-AI, a cloud-native platform that combines the ability to scale the most expensive AI workloads (e.g. GPT-3) with the simplicity of a laptop-like development environment. Colossal-AI is one of the fastest-growing open-source projects in scalable AI, with 3,600 GitHub stars, and is used by hundreds of organizations globally, including IBM, Ant Group, Walmart, Oxford, OPPO, and Oracle. Only eight months after its founding, HPC-AI Tech already has 10 key clients, including an automaker giant. The company has set a goal of 1,000 clients within three years, with 50% annual growth after that.
If you are interested, please email contact@hpcaitech.com


Related People

Yang You, Founder

Berkeley, California, United States

Yang You is a Presidential Young Professor at the National University of Singapore, on an early-career track at NUS for exceptional young academic talents with great potential to excel. He received his PhD in Computer Science from UC Berkeley, advised by Prof. James Demmel, former chair of the Computer Science Division and the EECS Department. His research interests include parallel/distributed algorithms, high-performance computing, and machine learning; his current research focuses on scaling up deep neural network training on distributed systems and supercomputers.

In 2017, his team broke the world record for ImageNet training speed, which was covered by technology outlets such as NSF, ScienceDaily, Science NewsLine, and i-programmer. In 2019, his team broke the world record for BERT training speed; these BERT training techniques have since been used by tech giants including Google, Microsoft, and NVIDIA. His LARS and LAMB optimizers are available in the industry benchmark MLPerf.

Yang You is a winner of the IPDPS 2015 Best Paper Award (0.8%), the ICPP 2018 Best Paper Award (0.3%), and the ACM/IEEE George Michael HPC Fellowship. He is a Siebel Scholar and a winner of the Lotfi A. Zadeh Prize. He was nominated by UC Berkeley for the ACM Doctoral Dissertation Award (2 out of 81 Berkeley EECS PhD students who graduated in 2020). He also made the Forbes 30 Under 30 Asia list (2021) and won the IEEE CS TCHPC Early Career Researchers Award for Excellence in High Performance Computing. For more information, please see https://www.forbes.com/profile/you-yang/