Hongxia Yang
The Hong Kong Polytechnic University
Keynote Title:
Time & Venue
9:00 - 10:00, 6 December 2024 (Friday)
Room TU201, PolyU
Abstract
The prevailing monopoly on GPU resources significantly restricts AI development, confining participation in the pretraining of Large Language Models (LLMs) to a small number of researchers. This project introduces a novel system that integrates hundreds of domain-specific models to construct a foundation model for Artificial General Intelligence (AGI) with minimal computational demand. By employing smaller, more efficient models, selecting top-ranked models across diverse domains through a robust ranking algorithm, and continuously optimizing the evolving foundation model, this approach seeks to democratize AI development. It shifts from the traditional 'model over data' method to a 'model over models' strategy, aiming to reduce reliance on extensive computational resources and to promote broader innovation and inclusivity in AI.
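To make the 'model over models' idea concrete, the following is a minimal sketch of one possible pipeline: rank candidate models within each domain, keep the top-ranked one, and integrate the winners by weighted parameter averaging. Every name here (`DomainModel`, `top_ranked_per_domain`, `merge_weights`) and the averaging-based merge are illustrative assumptions; the abstract does not specify the actual ranking algorithm or integration mechanism.

```python
# Hypothetical sketch of a "model over models" pipeline.
# All class and function names are illustrative assumptions,
# not the speaker's actual system.
from dataclasses import dataclass, field
from typing import Dict, List
import numpy as np


@dataclass
class DomainModel:
    """Stand-in for a small domain-specific model: its domain,
    a held-out evaluation score, and a flat parameter vector."""
    domain: str
    score: float                      # held-out score on its own domain
    weights: np.ndarray = field(repr=False)


def top_ranked_per_domain(models: List[DomainModel]) -> Dict[str, DomainModel]:
    """Keep only the best-scoring model in each domain (the 'ranking' step)."""
    best: Dict[str, DomainModel] = {}
    for m in models:
        if m.domain not in best or m.score > best[m.domain].score:
            best[m.domain] = m
    return best


def merge_weights(selected: Dict[str, DomainModel]) -> np.ndarray:
    """Integrate the per-domain winners into one foundation model by
    score-weighted parameter averaging (a 'model soup'-style merge,
    assumed here purely for illustration)."""
    scores = np.array([m.score for m in selected.values()])
    coeffs = scores / scores.sum()
    stacked = np.stack([m.weights for m in selected.values()])
    return (coeffs[:, None] * stacked).sum(axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pool = [
        DomainModel("math", 0.81, rng.normal(size=4)),
        DomainModel("math", 0.74, rng.normal(size=4)),
        DomainModel("code", 0.88, rng.normal(size=4)),
    ]
    winners = top_ranked_per_domain(pool)
    foundation = merge_weights(winners)
    print({d: m.score for d, m in winners.items()}, foundation)
```

The appeal of a merge-based integration, under these assumptions, is that it avoids any full-scale pretraining run: each domain model is small enough to train on modest hardware, and the combination step is a cheap weighted average rather than a gradient-based optimization over the whole parameter space.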
Biography
Prof. Hongxia Yang, with over 15 years of experience as an AI scientist, specializes in large-scale machine learning, data mining, and deep learning. Throughout her career, she has developed 10 significant algorithmic systems, improving the operations of various enterprises. Her research includes pre-trained models, big data analytics, and the practical deployment of large language model (LLM) systems in real-world settings. Prof. Yang has published more than 100 top-tier papers, amassed around 10K citations with an H-index of 44, and holds over 50 patents. She has received several awards, including the 2019 SAIL Award at the World Artificial Intelligence Conference and the 2020 National Science and Technology Progress Award, China’s top tech accolade. Named one of Forbes China’s Top 50 Women in Tech in 2022 and a recipient of the AI 2000 Most Influential Scholar Award in 2023-2024, Prof. Yang has held prominent roles at ByteDance US, Alibaba Group, Yahoo! Inc., and IBM T.J. Watson Research Center. She earned her PhD from Duke University and her B.S. from Nankai University.