Taiwan: Foxconn, the world’s largest contract electronics manufacturer, has launched its first large language model, FoxBrain, which it says will enhance manufacturing processes and supply chain management.
Foxconn said FoxBrain was trained on 120 Nvidia H100 GPUs, with training completed within four weeks. The model is built on Meta’s Llama 3.1 architecture and optimized for Traditional Chinese and Taiwanese language styles, making it Taiwan’s first reasoning-capable large language model.
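For context on what being built on Meta’s Llama 3.1 architecture looks like in practice, the short sketch below loads an open Llama 3.1 checkpoint with Hugging Face Transformers and prompts it in Traditional Chinese. FoxBrain itself has not been publicly released, so the model identifier here is only a stand-in assumption, not Foxconn’s internal weights.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Placeholder model ID: FoxBrain's weights are internal to Foxconn, so this uses
    # the open Llama 3.1 model that the article names as FoxBrain's foundation.
    base_id = "meta-llama/Llama-3.1-8B-Instruct"
    tokenizer = AutoTokenizer.from_pretrained(base_id)
    model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")

    # Example prompt in Traditional Chinese, the language style FoxBrain is said to target.
    prompt = "請根據下列供應鏈資料，提出製造流程的改善建議。"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))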
Initially designed for internal use, FoxBrain offers capabilities in data analysis, decision support, document collaboration, problem-solving, mathematics, reasoning, and code generation.
Foxconn plans to collaborate with technology partners to expand AI applications, share open-source information, and drive innovation in manufacturing and supply chain management.
While FoxBrain’s performance lags slightly behind China’s DeepSeek distillation model, Foxconn says its overall capabilities are close to world-class standards.
Foxconn credited Nvidia’s Taipei-1, the largest supercomputer in Taiwan, which Nvidia owns and operates in Kaohsiung, with providing the computing power and technical support used to train and refine FoxBrain.
Foxconn will reveal further details about FoxBrain at Nvidia’s GTC developer conference in mid-March, showcasing its potential to revolutionize smart manufacturing and supply chain decision-making.
With this launch, Foxconn strengthens its position as a leader in AI-driven manufacturing, leveraging cutting-edge technology to enhance efficiency and drive the future of intelligent automation.