Welcome, Llama 3.2, to Vertex AI! Here's what you need to know about Meta's new generation of models, now in Vertex AI Model Garden:

- Llama 3.2 features two multimodal models (11B and 90B) that let you reason over high-resolution images.
- Llama 3.2 also features two lightweight, text-only models (1B and 3B) designed for mobile and edge devices.
- When building with Llama 3.2 on Vertex AI, you get easy access, robust developer tools, pay-as-you-go pricing, and fully managed infrastructure to deploy with confidence.

Learn more → https://goo.gle/4gPf1jY
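For readers who want a concrete starting point, here is a minimal sketch of what a multimodal chat request to one of the Llama 3.2 vision models on Vertex AI might look like. It assumes the OpenAI-compatible chat-completions endpoint shape and a placeholder project, region, and model ID; check the Model Garden card for your model's actual identifier and endpoint before using this.

```python
# Sketch of a Llama 3.2 multimodal request on Vertex AI (assumed endpoint
# shape and model ID; verify against the Model Garden card).

PROJECT_ID = "my-project"   # placeholder: your GCP project ID
REGION = "us-central1"      # placeholder: a region where the model is served
MODEL = "meta/llama-3.2-90b-vision-instruct-maas"  # assumed model ID

# OpenAI-compatible chat-completions URL shape (assumption).
endpoint = (
    f"https://{REGION}-aiplatform.googleapis.com/v1/"
    f"projects/{PROJECT_ID}/locations/{REGION}/"
    "endpoints/openapi/chat/completions"
)

# A single user turn mixing text and an image URL, so the 11B/90B
# vision models can reason over the picture.
payload = {
    "model": MODEL,
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is shown in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/photo.jpg"},
                },
            ],
        }
    ],
    "max_tokens": 256,
}

# An actual call would POST `payload` to `endpoint` with a bearer token
# (e.g. from `gcloud auth print-access-token`) in the Authorization
# header, for instance via requests.post(endpoint, json=payload, ...).
print(endpoint)
print(payload["model"])
```

Because pricing is pay-as-you-go, a request like this incurs no standing infrastructure cost; the payload shape above mirrors the common chat-completions convention but is an assumption, not the documented Vertex AI contract.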
Every time I am in the clouds it is thanks to Google, thanks Google!
Now this sounds amazing!
Dual-model design, with localized on-device inference for the lightweight models and cloud compute for the larger ones, is starting to heat up. Love this!
We would like to speak with you 🙌🏻 🇨🇱
An amazing and exciting collaboration
Excited to see what Llama 3.2 can do! Great work to Google Cloud and Meta on this partnership. Best wishes for a successful launch.
👍🏿
Waiting for the next update
Exciting news! The introduction of Llama 3.2 to Vertex AI Model Garden opens up so many possibilities for developers and businesses. The combination of multimodal models for image reasoning and lightweight text-only models for edge devices makes this release incredibly versatile. Having access to these advanced models through Vertex AI’s managed infrastructure and pay-as-you-go pricing is a huge advantage—especially for scaling projects without worrying about backend complexity. Looking forward to exploring what these models can do, particularly in image analysis and edge-based AI applications. Thanks for sharing this update—ready to dive into Llama 3.2 on Vertex AI!