home.social
  1. 🙌 Huge thanks to everyone who contributed to this journey, from writing code and reviewing docs to supporting governance and community growth.

    Stay tuned! We’ll be publishing a detailed announcement blog soon with more insights on what this means for users, contributors, and the future of model serving on Kubernetes.

    For now: thank you to the community for making this possible. 💙

    Kubeflow

  2. Tag them in the comments or share this post — let’s spread the word!

    If you are curious about what our teams are working on with open source communities, check out this newsletter that we just launched! inferenceops.substack.com/

  3. A huge thank you to Kevin Wang and Faseela K from the CNCF TOC for all the hard work. It’s been such a pleasure collaborating with you both on this milestone. Thank you to all the community members who have contributed!

    This is a big step for the KServe community, and we’re excited about the road ahead in making cloud-native model serving more accessible and production-ready for everyone.

    CNCF Kubernetes Kubeflow

  4. Big thanks to everyone contributing code, reviews, and ideas — this integration is shaping up to be a game-changer for Kubernetes-native LLM serving. Stay tuned for the next release!

  5. Get ready for KubeCon next week! Below are the three talks I'll be presenting. See you there! github.com/terrytangyuan/publi

    - Cloud Native AI Day Keynote: Advancing Cloud Native AI Innovation Through Open Collaboration, sponsored by Red Hat

    - Unlocking Potential of Large Models in Production with Adam Tetelman

    - WG Serving: Accelerating AI/ML Inference Workloads on Kubernetes with Eduardo Arango