Group to collaborate on high-performance, sustainable and more economical solutions for AI inference
The AI Platform Alliance today announced the expansion of its consortium, which aims to combine the key components required to operate a modern AI compute service into more open, economical and sustainable solutions. Formed last year at the Open Compute Conference, the group initially comprised AI accelerator companies. The Alliance has now expanded to include cloud managed service providers, system suppliers and integrators, and ISVs, reflecting a maturing ecosystem focused on delivering solutions for the most demanding AI inference use cases.
The evolving Alliance ecosystem has focused on providing practical, easily adoptable solutions through a new marketplace, now available on the AI Platform Alliance website. The solutions offered by Alliance members improve both the power efficiency and the cost efficiency of AI inference while delivering better overall performance than the GPU-based solutions most commonly deployed today.
New companies joining the AI Platform Alliance include ADLINK, ASRock Rack, ASA Computers, Canonical, Clairo.ai, Deepgram, DeepX, ECS/Equus, GIGABYTE (Giga Computing), Kamiwaza.ai, Lampi.ai, NETINT, NextComputing, opsZero, Positron, Prov.net/Alpha3, Responsible Compute, Supermicro, Untether, View IO, and Wallaroo.ai. They join founding members including Ampere Computing®, Cerebras Systems, Furiosa, Graphcore, Kalray, Kinara, Luminous, Neuchips, Rebellions and Sapeon. Membership now comprises more than 30 organizations spanning five key sectors that supply products and services to the burgeoning AI inference market.
The AI Platform Alliance was formed specifically to promote better collaboration and openness in AI. This solidarity of vision comes at a pivotal moment not just for the technology industry, but for the world at large. The explosion of AI has created unprecedented demand for compute power, not only to run AI algorithms but also to pull together all the systems, applications and services required to implement a modern AI-enabled digital service. While solutions to date have mainly addressed AI training of ever more powerful models, AI inference can require up to 10x more traditional compute to support the processes that run a complex AI-enabled service. These stacks require an ecosystem of technology, services and applications working together seamlessly, combining best-in-class ingredients with easy-to-adopt recipes to scale AI inference use cases.
AI Platform Alliance members will work together to validate joint AI solutions that provide a diverse set of alternatives to the vertically oriented, GPU-based status quo. By developing these solutions as a community, the group will accelerate the pace of AI innovation while making AI platforms more open and transparent, increasing the capacity of AI to solve real-world problems, speeding practical adoption, and delivering environmentally friendly and socially responsible infrastructure at scale.
Various members of the AI Platform Alliance are expected to showcase solutions at Yotta 2024 in Las Vegas October 7-9. The AI Platform Alliance is open today to potential new members looking to change the AI status quo. Companies interested in joining can access more information and apply at https://platformalliance.ai.
About Ampere
Ampere is a modern semiconductor company designing the future of cloud computing with the world's first Cloud Native Processors. Built for the sustainable cloud, Ampere processors deliver the highest performance and best performance per watt, accelerating the delivery of all cloud computing applications with industry-leading performance, power efficiency and scalability. For more information, visit Ampere Computing.