IBM Research

Research Services

Yorktown Heights, New York · 75,871 followers

Inventing what's next in science and technology. Subscribe to our newsletter for the latest: https://ibm.biz/BdMdCb

About us

IBM Research is a group of researchers, scientists, technologists, designers, and thinkers inventing what’s next in computing. We’re relentlessly curious about all the ways that computing can change the world. We’re obsessed with advancing the state of the art in AI, hybrid cloud, and quantum computing. We’re discovering new materials for the next generation of computer chips; we’re building bias-free AI that can take the burden out of business decisions; we’re designing a hybrid-cloud platform that essentially operates as the world’s computer. We’re moving quantum computing from a theoretical concept to machines that will redefine industries. The problems the world is facing today require us to work faster than ever before. We want to catalyze scientific progress by scaling the technologies we’re working on and deploying them with partners across every industry and field of study. Our goal is to be the engine of change for IBM, our partners, and the world at large.

Website
http://www.research.ibm.com/
Industry
Research Services
Company size
10,001+ employees
Headquarters
Yorktown Heights, New York

Updates

  • Today, IBM announces a major breakthrough in co-packaged optics that will bring the speed of light to generative AI: https://lnkd.in/gyD94Aua IBM researchers have invented a way for fiber optics to connect chips on a circuit board, which promises to improve energy efficiency, boost bandwidth, and accelerate generative #AI computing development. This co-packaged optics technology introduces an all-new blueprint for how we transmit information. By enabling chipmakers to add six times as many optical fibers at the edge of a chip (a measure called “beachfront density”) compared to the current state of the art, IBM’s optical structures have the potential to massively boost the bandwidth between chips. As IBM Distinguished Engineer John Knickerbocker said about his team’s development of this new technology: “Even the most capable semiconductor components are only as fast as the connections between them.” ---- #IBM #Research #Chips #Semiconductors #FiberOptics #DataCenter

  • IBM Research

    IBM Research announces the release of several new open-source AI models for materials discovery: https://lnkd.in/gqJk4Y_G ⚛️ Aimed at accelerating the search for more sustainable materials with applications in chip fabrication, clean energy, and consumer packaging, these foundation models, pre-trained on vast molecular databases, can be used to screen millions of molecules at a time for desirable properties while weeding out the ones with dangerous side effects. 📚 Explore the models and the new working group for materials that IBM has launched through The AI Alliance in the link above. Start building with IBM’s Foundation Models for Materials (FM4M) using the links below; a minimal usage sketch follows this item. 👾 GitHub: https://lnkd.in/guUr9kng 🤗 Hugging Face: https://lnkd.in/gURHJ66f 𝗪𝗵𝗮𝘁'𝘀 𝗻𝗲𝘅𝘁? IBM researchers will demo the foundation models at the upcoming Association for the Advancement of Artificial Intelligence (AAAI) conference in February. In the coming year, they plan to release new #fusion techniques and models built on additional data modalities, including the positioning of atoms in 3D space. Stay tuned. ---- #IBM #AI #Chemistry #Sustainability

    • Meet IBM’s new family of AI models for materials discovery
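    A minimal sketch of the screening workflow described above, assuming a SMILES-based transformer encoder that can be loaded through Hugging Face transformers. The model ID below is a placeholder, not a confirmed FM4M checkpoint name; see the GitHub and Hugging Face links above for the published models.

    ```python
    # Hedged sketch: embed candidate molecules (as SMILES strings) with a
    # transformer encoder and hand the embeddings to a downstream property model.
    # MODEL_ID is a placeholder, not a confirmed IBM checkpoint name.
    import torch
    from transformers import AutoTokenizer, AutoModel

    MODEL_ID = "your-org/smiles-encoder"  # placeholder

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModel.from_pretrained(MODEL_ID)
    model.eval()

    candidates = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]  # SMILES strings

    with torch.no_grad():
        batch = tokenizer(candidates, padding=True, return_tensors="pt")
        hidden = model(**batch).last_hidden_state
        # Mean-pool token states into one embedding per molecule.
        mask = batch["attention_mask"].unsqueeze(-1).float()
        embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

    # In a real screen, these embeddings would feed a property-prediction head
    # (e.g. a small regressor) to rank candidates and filter out molecules with
    # undesirable predicted properties.
    print(embeddings.shape)  # (num_candidates, hidden_dim)
    ```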
  • IBM Research reposted this

    Ever wanted to check out one of our labs? Join Jerry Chow, IBM Fellow and Director of Quantum Systems, as he walks through the IBM Quantum characterization lab, where quantum chips – like Falcon and Heron – and the systems that power them are tested before deployment. In this tour, we also visit quantum researcher Daniela Bogorin for an introduction to the ultra-cold (0.050 kelvin) dilution refrigerator where she and her team can accurately capture quantum processor key performance indicators – like qubit frequency, qubit coherence, and flux coupler tuning, among others – to determine readiness for client system deployment. (A brief sketch of reading these reported qubit metrics programmatically follows below.)
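
    A minimal sketch of pulling the same kind of qubit metrics (frequency, T1 and T2 coherence times) reported from characterization, using Qiskit's IBM Runtime client. It assumes qiskit-ibm-runtime is installed and an IBM Quantum account has already been saved locally; the backend name is only an example.

    ```python
    # Hedged sketch: read reported qubit properties from an IBM Quantum backend.
    # Assumes a saved IBM Quantum account; "ibm_brisbane" is an example backend name.
    from qiskit_ibm_runtime import QiskitRuntimeService

    service = QiskitRuntimeService()
    backend = service.backend("ibm_brisbane")  # example backend name

    # BackendV2 exposes per-qubit calibration data as QubitProperties
    # (t1 and t2 in seconds, frequency in Hz); values may be None on some systems.
    for q in range(min(5, backend.num_qubits)):
        props = backend.qubit_properties(q)
        print(f"qubit {q}: frequency={props.frequency} Hz, T1={props.t1} s, T2={props.t2} s")
    ```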

  • IBM Research reposted this

    Armand Ruiz, VP of Product - AI Platform @IBM

    🔥 Hot off the press: our final announcement of the year — introducing Granite 3.1, a set of workhorse LLMs built for enterprise-scale AI challenges.

    To level set, let me first explain what IBM Granite is. Granite is IBM's family of foundation models and LLMs designed for businesses to scale their AI applications. Granite models are open source and can be used for various enterprise tasks. What's new with this 3.1 version?

    1/ Granite 3.1 8B Instruct delivers significant performance improvements over Granite 3.0 8B Instruct. Its average score across the Hugging Face OpenLLM Leaderboard benchmarks is now among the highest of any open model in its weight class.

    2/ We’ve expanded the context windows of the entire Granite 3 language model family. Our latest dense models (Granite 3.1 8B, Granite 3.1 2B), MoE models (Granite 3.1 3B-A800M, Granite 3.1 1B-A400M), and guardrail models (Granite Guardian 3.1 8B, Granite Guardian 3.1 2B) all feature a 128K token context length.

    3/ We’re releasing a family of all-new embedding models. The new retrieval-optimized Granite Embedding models are offered in four sizes, ranging from 30M to 278M parameters. Like their generative counterparts, they offer multilingual support across 12 languages: English, German, Spanish, French, Japanese, Portuguese, Arabic, Czech, Italian, Korean, Dutch, and Chinese.

    4/ Granite Guardian 3.1 8B and 2B feature a new function-calling hallucination detection capability, allowing increased control over, and observability of, agents making tool calls.

    5/ All Granite 3.1, Granite Guardian 3.1, and Granite Embedding models are open source under the Apache 2.0 license.

    6/ These latest entries in the Granite series follow IBM’s recent launch of Docling (an open-source framework for prepping documents for RAG and other generative AI applications) and Bee (an open-source, model-agnostic framework for agentic AI).

    7/ Granite TTM (TinyTimeMixers), IBM’s series of compact but highly performant time-series models, is now available in watsonx.ai through the beta release of the watsonx.ai Timeseries Forecasting API and SDK.

    8/ Granite 3.1 models are now available in IBM watsonx.ai and through platform partners, including (in alphabetical order) Docker, Hugging Face, LM Studio, Ollama, and Replicate.

    9/ Granite 3.1 will also be leveraged internally by enterprise partners: Samsung is integrating select Granite models into its SDS platform, and Lockheed Martin is integrating Granite 3.1 models into its AI Factory tools, used by over 10,000 developers and engineers.

    Learn more here:
    - Official announcement blog: https://lnkd.in/gY8feQwh
    - Try the models in the Granite Model Playground: https://lnkd.in/gY93aw6p
    - Tutorial implementing agentic RAG locally with Ollama, Open WebUI, and Granite 3.1: https://lnkd.in/gpwA2KHJ
    - Models on Hugging Face: https://lnkd.in/gr2Zir36

    This holiday season... let’s roll up our sleeves and start building! 🤓 (A minimal loading sketch follows below.)
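
    A minimal loading sketch, under the assumption that the 8B Instruct checkpoint follows IBM's published Hugging Face naming (ibm-granite/granite-3.1-8b-instruct); confirm the exact model ID on the Hugging Face link above.

    ```python
    # Hedged sketch: run a Granite 3.1 instruct model locally with Hugging Face
    # transformers. The model ID is assumed from IBM's naming scheme; verify it
    # on Hugging Face before use.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_ID = "ibm-granite/granite-3.1-8b-instruct"  # assumed ID, verify on Hugging Face

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )

    messages = [{"role": "user", "content": "Summarize what an embedding model does."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
    ```

    The same checkpoints are also served through the partner platforms listed above (for example Ollama) for local experimentation.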

  • IBM Research reposted this

    IBM

    IBM Granite is getting stronger and shipping faster. 🔥 Today, we’re releasing IBM Granite 3.1 – the latest update to our Granite series of open and performant language models, delivering new features and significant performance improvements. What's new? We upped the context windows to 128K for all models. We added new function calling hallucination detection capabilities so users can have more control over agentic #AI workflows. Plus, we’re introducing 4 new Granite embedding models that offer multilingual support across 12 different languages. Dive into the full details here: https://ibm.biz/BdGXBh

  • IBM Research reposted this

    Dario Gil, IBM Senior Vice President and Director of Research

    Today, I’m proud to introduce Granite 3.1 – another step forward in IBM's mission to deliver high-performing, enterprise-ready open source AI models. With every release, we’re getting better and faster – putting out the building blocks businesses need to build reliable, cost-effective AI in a safe, responsible, and scalable way. We’re seeing significant performance improvement with this release – our 8B Instruct model is now topping the Hugging Face OpenLLM V2 leaderboard and competitive with leading lightweight models from Meta, Google, Qwen, and Mistral AI. We’ve expanded the context windows to 128K tokens across the entire Granite family — dense, MoE, and guardrail models — so they can retain and work with long-form content for better performance on general AI tasks. Our Granite Guardian models offer optimal observability with new hallucination detection capability for function-calling, enabling greater reliability and precision in agentic workflows. And above all, we continue to ship under the permissive Apache 2.0 license – reinforcing our belief that AI is nothing if it’s not open, collaborative, and transparent.

    IBM wants to be the enterprise LLM king with its new open-source Granite 3.1 models

    https://venturebeat.com

  • IBM Research reposted this

    Samsung and IBM Research are working together to help shape the future of #computing. Samsung’s R&D center is collaborating with IBM Research to develop cutting-edge technology that could overcome the limitations of today's copper (Cu) based interconnects: a new post-Cu alternative interconnect technology that has the potential to improve performance and reliability. This innovative development, which was presented at the #IEDM conference last week, is vital for future #CMOS technologies. Read more here: https://lnkd.in/dpv35wQh #SamsungSemiUS

    Copper evolution and beyond: Developments in advanced interconnects for future CMOS nodes

    research.ibm.com

  • IBM Research

    Featured in WIRED, IBM Research Chief Scientist Ruchir Puri illustrates IBM’s open-source approach to enterprise #AI models — underscoring the business value of data transparency, data protection and the flexibility afforded by smaller, tailored foundation models. Read our sponsored blog here: https://lnkd.in/g3_mycx2 “I believe open models and an open ecosystem will win in business environments.” --- #IBM #Research #InstructLab #Granite

