{"id":13911,"date":"2026-04-19T02:08:26","date_gmt":"2026-04-19T02:08:26","guid":{"rendered":"https:\/\/techtrendfeed.com\/?p=13911"},"modified":"2026-04-19T02:08:26","modified_gmt":"2026-04-19T02:08:26","slug":"a2ui-v0-9-the-new-customary-for-moveable-framework-agnostic-generative-ui","status":"publish","type":"post","link":"https:\/\/techtrendfeed.com\/?p=13911","title":{"rendered":"A2UI v0.9: The New Standard for Portable, Framework-Agnostic Generative UI"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p><img decoding=\"async\" class=\"banner-image\" src=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/images\/Hero.original.jpg\" alt=\"Hero\"\/>  <\/p>\n<div class=\"inner-block-content rich-content\">\n<p data-block-key=\"6mowm\">Generative UI allows AI agents to generate tailored UI widgets in real time, matching the interface to the user\u2019s specific interaction. But to move from demos to production, we need a clean separation of concerns. A2UI v0.9 is our answer: a framework-agnostic standard for declaring UI intent. 
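To make "declaring UI intent" concrete, here is a purely illustrative sketch: an agent emits a declarative component tree, and the client maps each node onto widgets from its own catalog. The component names and payload shape below are invented for this example and are not the actual A2UI schema; see a2ui.org for the real spec.

```python
# Hypothetical illustration of "declaring UI intent": the agent emits a
# declarative component tree; the client decides how to render each node.
# NOTE: these type names are invented for this sketch, not the A2UI spec.

ui_intent = {
    "type": "Column",
    "children": [
        {"type": "Text", "text": "Book a table"},
        {"type": "Button", "label": "Confirm", "action": "confirm_booking"},
    ],
}

def render(node, indent=0):
    """Walk the declarative tree and emit a plain-text preview of the UI."""
    pad = "  " * indent
    if node["type"] == "Text":
        return f"{pad}[text] {node['text']}"
    if node["type"] == "Button":
        return f"{pad}[button] {node['label']} -> {node['action']}"
    if node["type"] == "Column":
        lines = [f"{pad}[column]"]
        lines += [render(child, indent + 1) for child in node["children"]]
        return "\n".join(lines)
    raise ValueError(f"unknown component type: {node['type']}")

print(render(ui_intent))
```

Because the tree only names component types, the same intent can be rendered by a Flutter, React, or Angular catalog without the agent knowing which one is on the other end.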
It allows local or remote agents to communicate with any client application using a common language, ensuring your agent can generate your UI using your existing component catalog on any device.<\/p>\n<h2 data-block-key=\"yu4pi\" id=\"\"><b>Agents can \u201cspeak\u201d UI with your design system, no need to change it.<\/b><\/h2>\n<\/div>\n<div class=\"inner-block-content video-block\">\n<p>        <video autoplay=\"\" loop=\"\" muted=\"\" playsinline=\"\" poster=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/wagtailvideo-4pjfmx5w_thumb.jpg\"><source src=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/a2ui-component-catalogs.mp4\" type=\"video\/mp4\"><p>Sorry, your browser does not support playback for this video<\/p>\n<p><\/source><\/video><\/p>\n<\/div>\n<div class=\"inner-block-content rich-content\">\n<p data-block-key=\"5cw4u\">A2UI is designed to work on web, mobile, and anywhere else your users are.<\/p>\n<h2 data-block-key=\"7zrmt\" id=\"what's-new-in-a2ui-v0.9\"><b>What&#8217;s New in A2UI v0.9<\/b><\/h2>\n<p data-block-key=\"ens6l\">This release focuses on making it easier than ever to build agents and integrate with your existing frontends. It hardens our internal abstractions, simplifies streaming, and improves the developer experience.<\/p>\n<ul>\n<li data-block-key=\"e5sge\"><b>From &#8220;Standard&#8221; to &#8220;Basic&#8221;:<\/b> Frontend developers don&#8217;t want new components. They already have a design system and components they use. Agents should respond dynamically, using existing front ends. We renamed our optional component set to &#8216;Basic&#8217; to make this clearer. 
Check out the <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/a2ui.org\/concepts\/catalogs\">component catalog<\/a> docs and code samples to connect A2UI to your existing front ends.<\/li>\n<li data-block-key=\"ak6dn\"><b>A Shared Web-Core Library:<\/b> On the client side, we have released a shared <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/github.com\/google\/A2UI\/tree\/main\/renderers\/web_core\">web-core<\/a> library that greatly simplifies any browser UI renderer. We have also landed the official React renderer and version-bumped all <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/a2ui.org\/reference\/renderers\/\">A2UI-supported renderers<\/a> (Flutter, Lit, Angular, and React), while carving out a dedicated spot for <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/a2ui.org\/ecosystem\/renderers\/\">community renderers<\/a>.<\/li>\n<li data-block-key=\"5dd65\"><b>The Agent SDK:<\/b> Building the agent side of the equation just got a lot easier with the <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/a2ui.org\/reference\/agents\/\">A2UI Agent SDK<\/a>. We\u2019ve optimized the generation pipeline with new caching layers to ensure a high-performance, low-latency UI experience.<\/li>\n<li data-block-key=\"7se9q\"><b>New Language Features:<\/b> A2UI 0.9 adds client-defined functions (perfect for validation), client-to-server data syncing to support collaborative editing with your agent, improved error handling, and a simplified, modular schema.<\/li>\n<li data-block-key=\"3t8s1\"><b>Simplified Transports:<\/b> We have refined our transport interfaces so connecting your agents and clients is much smoother. 
A2UI over MCP, WebSockets, REST, AG UI, A2A, or whatever you want.<\/li>\n<\/ul>\n<\/div>\n<div class=\"inner-block-content video-block\">\n<p>        <video autoplay=\"\" loop=\"\" muted=\"\" playsinline=\"\" poster=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/wagtailvideo-b95trh8r_thumb.jpg\"><source src=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/a2ui_scroller_Composer_2.mp4\" type=\"video\/mp4\"><p>Sorry, your browser does not support playback for this video<\/p>\n<p><\/source><\/video><\/p>\n<p>This example shows a replay of streaming chunks, slowing down and replaying instances across renderers<\/p>\n<\/div>\n<div class=\"inner-block-content rich-content\">\n<p data-block-key=\"5cw4u\">Adding A2UI to any Python agent is now a simple pip install or uv add away (Go and Kotlin coming soon).<\/p>\n<\/div>\n<div class=\"inner-block-content code-block line-numbers\">\n<pre><code class=\"language-shell\">pip install a2ui-agent-sdk<\/code><\/pre>\n<p>\n        Shell\n    <\/p>\n<\/div>\n<div class=\"inner-block-content rich-content\">\n<p data-block-key=\"5cw4u\">Integrating A2UI into your existing agent is a straightforward 5-step process. Here\u2019s the &#8220;Hello World&#8221; of A2UI integration:<\/p>\n<\/div>\n<div class=\"inner-block-content code-block line-numbers\">\n<pre><code class=\"language-python\"># Step 1: Define your catalog (basic or bring your own) with optional examples&#13;\nmy_catalog = CatalogConfig.from_path(&#13;\n    name=\"&lt;my_catalog_name&gt;\",&#13;\n    catalog_path=\"file:\/\/\/path\/to\/catalog.json\",&#13;\n    # Optional: help the LLM with \"few-shot\" learning&#13;\n    examples_path=\"path\/to\/examples\/folder\/*.json\"&#13;\n)&#13;\n&#13;\n# Step 2: Initialize the Schema Manager to manage A2UI spec versions&#13;\nschema_manager = A2uiSchemaManager(&#13;\n    version=\"0.9\",&#13;\n    catalogs=[my_catalog],&#13;\n)&#13;\n&#13;\n# Step 3: Generate the system prompt, which handles A2UI instructions&#13;\nsystem_instruction = schema_manager.generate_system_prompt(&#13;\n    role_description=\"You are a helpful assistant great at generating UI...\",&#13;\n)&#13;\n&#13;\n# Step 4: Initialize your LLM agent with the generated instructions&#13;\nmy_agent = AnyAgentFrameworkLLMAgent(instruction=system_instruction, ...)&#13;\n&#13;\n# Step 5: Execute and stream the UI&#13;\ndef handle_turn(user_query):&#13;\n    llm_response = my_agent.reply(user_query)&#13;\n&#13;\n    # In your executor, the SDK helps parse, repair, and validate the LLM's JSON on the fly&#13;\n&#13;\n    selected_catalog = schema_manager.get_selected_catalog()&#13;\n    final_parts = parse_response_to_parts(llm_response, selected_catalog.validator)&#13;\n    yield {&#13;\n        \"is_task_complete\": True,&#13;\n        \"parts\": final_parts,&#13;\n    }<\/code><\/pre>\n<p>\n        Python\n    <\/p>\n<\/div>\n<div class=\"inner-block-content rich-content\">\n<p data-block-key=\"6cj3z\"><b>Go Beyond the Basics<\/b><\/p>\n<p data-block-key=\"uodc\">While the example above shows a simple static integration, the A2UI Agent SDK is built for production-grade complexity. 
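The parse-and-validate step in the executor above can be sketched with stdlib-only code. This is a simplified stand-in: the `CATALOG` dict and `validate_part` helper are hypothetical names for this sketch, and the real SDK validates against a full JSON schema catalog rather than a type-to-fields map.

```python
import json

# A toy catalog: allowed component types and their required fields.
# These names are illustrative; a real A2UI catalog is a JSON schema.
CATALOG = {
    "Text": {"text"},
    "Button": {"label", "action"},
}

def validate_part(part: dict) -> dict:
    """Reject components the client catalog cannot render."""
    kind = part.get("type")
    if kind not in CATALOG:
        raise ValueError(f"component type not in catalog: {kind!r}")
    missing = CATALOG[kind] - part.keys()
    if missing:
        raise ValueError(f"{kind} is missing fields: {sorted(missing)}")
    return part

# The LLM's raw output, already repaired into valid JSON by the parser.
llm_json = '[{"type": "Button", "label": "Save", "action": "save"}]'
parts = [validate_part(p) for p in json.loads(llm_json)]
print(len(parts))
```

Validating against the catalog before yielding parts is what lets the agent trust that everything it streams can actually be rendered on the other end.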
Out of the box, it supports:<\/p>\n<ul>\n<li data-block-key=\"787j1\"><b>Version Negotiation:<\/b> Dynamically select the best A2UI specification version based on the client&#8217;s capabilities.<\/li>\n<li data-block-key=\"c9vkj\"><b>Dynamic Catalogs:<\/b> Swap between multiple catalog schemas at runtime to match individual user permissions or device constraints.<\/li>\n<li data-block-key=\"ba2pu\"><b>Resilient Streaming:<\/b> Incrementally parse and heal LLM output, allowing the client to render UI components as they&#8217;re being generated\u2014no waiting for the full JSON block.<\/li>\n<\/ul>\n<p data-block-key=\"1bjb1\">Explore our <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/github.com\/google\/A2UI\">Agent Samples<\/a> to see these advanced features in action.<\/p>\n<p data-block-key=\"85rhb\">We&#8217;re working on some neat things like better MCP Apps integrations, progressive disclosure \u201cskills\u201d for A2UI, human intent abstractions, PII support, and lots more. 
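The "parse and heal" idea behind resilient streaming can be illustrated with a small stdlib-only sketch that appends whatever closers a truncated JSON prefix still needs. This is an assumption about the general technique, not the SDK's actual implementation, and it deliberately ignores harder cases such as a chunk that ends on a dangling key.

```python
import json

def heal_partial_json(chunk: str) -> str:
    """Append the closers a truncated JSON prefix still needs."""
    stack, in_string, escaped = [], False, False
    for ch in chunk:
        if in_string:
            if escaped:
                escaped = False
            elif ch == "\\":
                escaped = True
            elif ch == '"':
                in_string = False
        elif ch == '"':
            in_string = True
        elif ch in "{[":
            stack.append("}" if ch == "{" else "]")
        elif ch in "}]":
            stack.pop()
    # Close an unterminated string, then unwind the open containers.
    return chunk + ('"' if in_string else "") + "".join(reversed(stack))

# A stream cut off mid-value: heal it so the client can render early.
partial = '{"type": "Card", "children": [{"type": "Text", "text": "Loa'
doc = json.loads(heal_partial_json(partial))
print(doc["children"][0]["text"])  # prints: Loa
```

Healing each chunk into a well-formed snapshot is what lets a renderer paint partial UI while the rest of the stream is still arriving.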
Check out our updated <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/a2ui.org\/roadmap\/\">roadmap<\/a> and be sure to show us what <b>you<\/b> are working on.<\/p>\n<h2 data-block-key=\"r7ndr\" id=\"the-growing-generative-ui-ecosystem\"><b>The Growing Generative UI Ecosystem<\/b><\/h2>\n<p data-block-key=\"teb\">A standard is only as good as the ecosystem around it, and the landscape is evolving rapidly.<\/p>\n<\/div>\n<div class=\"inner-block-content rich-content\">\n<ul>\n<li data-block-key=\"jvmvp\"><b>AG2:<\/b> Built by the creators of AutoGen, AG2 created <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/docs.ag2.ai\/latest\/docs\/user-guide\/reference-agents\/a2uiagent\/\">A2UIAgent<\/a> as a native integration of A2UI; see it in action in the video above.<\/li>\n<li data-block-key=\"9h0pi\"><b>A2A 1.0:<\/b> The Agent-to-Agent (A2A) <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/a2a-protocol.org\/latest\/announcing-1.0\/\">1.0 protocol has officially launched<\/a>. It serves as a robust transport for remote agents communicating with other agents, or simply for connecting agents directly to frontends.<\/li>\n<li data-block-key=\"e5034\"><b>Vercel&#8217;s json-renderer:<\/b> Vercel recently released <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/json-render.dev\/docs\/a2ui\">json-renderer with A2UI support<\/a> as a proof of concept. This could become a dedicated A2UI renderer for the Vercel community.<\/li>\n<li data-block-key=\"77tjv\"><b>Oracle\u2019s Agent Spec:<\/b> Oracle recently shipped <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/blogs.oracle.com\/ai-and-datascience\/announcing-agent-spec-for-a2ui-copilotkit-ag-ui\">Agent Spec + A2UI + AG UI<\/a> support; Agent Spec defines what runs, AG\u2011UI carries the interaction, and A2UI defines what the user touches. 
Swap implementations at any layer while keeping the experience stable.<\/li>\n<li data-block-key=\"ii5h\"><b>AG-UI:<\/b> Support for connecting a broad spectrum of GenUI capabilities into agentic web apps, including A2UI, MCP Apps, and Open Generative UI.<\/li>\n<\/ul>\n<p data-block-key=\"696bo\">We&#8217;re seeing incredible implementations of A2UI across the industry. Here are a few recent sightings:<\/p>\n<h2 data-block-key=\"27504\" id=\"personal-health-companion\"><b>Personal Health Companion<\/b><\/h2>\n<p data-block-key=\"5mr1r\">The GenUI Personal Health Companion is an open-source app designed to eliminate &#8220;data silos&#8221; and &#8220;navigation fatigue&#8221; by replacing static dashboards with a modular, AI-driven interface. Developed by <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/rebelappstudio.com\/\">Rebel App Studio<\/a>, <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/codemate.com\/creative-tech\/\">Codemate\u2019s<\/a> specialist Flutter team, this solution leverages real-time data orchestration to bridge the gap between fragmented medical records and wearable telemetry. Rather than forcing users to dig through sub-menus, the app uses a central LLM-powered chat that can dynamically generate UI widgets on the fly, surfacing critical lab results, vaccine expirations, or clinic locations based on immediate context. 
By grounding AI insights directly in the user\u2019s unique health data, the app transforms passive health monitoring into a proactive, intent-driven assistant built for the modern digital patient.<\/p>\n<p data-block-key=\"9qimo\">Dive deeper into how the Health Companion was built on <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/codemate.com\/articles\/flutter-genui-demo\/\">Codemate\u2019s blog<\/a>\u2014and explore the <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/github.com\/rebelappstudio\/genui_personal_health_companion\">open-source demo on GitHub<\/a>.<\/p>\n<\/div>\n<div class=\"inner-block-content video-block\">\n<p>        <video autoplay=\"\" loop=\"\" muted=\"\" playsinline=\"\" poster=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/wagtailvideo-y6tl01h3_thumb.jpg\"><source src=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/frame-l.mp4\" type=\"video\/mp4\"><p>Sorry, your browser does not support playback for this video<\/p>\n<p><\/source><\/video><\/p>\n<\/div>\n<div class=\"inner-block-content rich-content\">\n<h2 data-block-key=\"m475t\" id=\"personal-financial-planner\"><b>Personal Financial Planner<\/b><\/h2>\n<p data-block-key=\"89bcq\">The Life Goal Simulator is an interactive demonstration of how Generative UI bridges the gap between consumer expectations and the static experiences currently offered by the financial services industry. Built by <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/verygood.ventures\/\">Very Good Ventures<\/a> (VGV)\u2014a Flutter and GenUI consultancy trusted by brands like Toyota and GEICO\u2014the app moves beyond traditional, one-size-fits-all interfaces by putting the user\u2019s life at the center of the experience. 
By selecting a persona and a goal, such as saving for retirement or a first home, users hand the wheel to Gemini, which uses the Flutter GenUI SDK to generate a native-feeling, real-time UI from a curated catalog of interactive widgets like sliders, bar charts, and multi-selects.<\/p>\n<p data-block-key=\"13g7\">Check out the <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/github.com\/VGVentures\/genui_life_goal_simulator\">open-source code<\/a> for this demo, and you can also try a <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/gcn26demo.vgv.ai\/\">live interactive demo<\/a> of this experience.<\/p>\n<\/div>\n<div class=\"inner-block-content video-block\">\n<p>        <video autoplay=\"\" loop=\"\" muted=\"\" playsinline=\"\" poster=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/wagtailvideo-i5zaceos_thumb.jpg\"><source src=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/VGV_2.mp4\" type=\"video\/mp4\"><p>Sorry, your browser does not support playback for this video<\/p>\n<p><\/source><\/video><\/p>\n<\/div>\n<div class=\"inner-block-content rich-content\">\n<h3 data-block-key=\"1rayq\" id=\"a2ui-with-any-agent-framework-(via-ag-ui)\"><b>A2UI with any Agent Framework (via AG-UI)<\/b><\/h3>\n<p data-block-key=\"bcgfc\">Any agent that already speaks AG-UI can drive A2UI v0.9 on day zero. No custom integration is required. This works through AG-UI&#8217;s middleware system: a small piece of code that plugs into your existing agent pipeline. 
It teaches your agent how to speak A2UI, wires the responses correctly, and handles streaming, converting the agent&#8217;s output into components your UI can render immediately, using A2UI&#8217;s built-in renderers or your own custom components.<\/p>\n<\/div>\n<div class=\"inner-block-content video-block\">\n<p>        <video autoplay=\"\" loop=\"\" muted=\"\" playsinline=\"\" poster=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/wagtailvideo-gm1t8nwe_thumb.jpg\"><source src=\"https:\/\/storage.googleapis.com\/gweb-developer-goog-blog-assets\/original_videos\/A2UI_-_AG-UI_Starter.mp4\" type=\"video\/mp4\"><p>Sorry, your browser does not support playback for this video<\/p>\n<p><\/source><\/video><\/p>\n<\/div>\n<div class=\"inner-block-content rich-content\">\n<p data-block-key=\"0iayf\">Get this starter template running on your machine:<\/p>\n<\/div>\n<div class=\"inner-block-content code-block line-numbers\">\n<pre><code class=\"language-shell\">npx copilotkit@latest create my-app --framework a2ui<\/code><\/pre>\n<p>\n        Shell\n    <\/p>\n<\/div>\n<div class=\"inner-block-content rich-content\">\n<h2 data-block-key=\"t3qsl\" id=\"get-started\"><b>Get Started<\/b><\/h2>\n<p data-block-key=\"35bqn\">Ready to unshackle your agents and let them drive your front end with whatever components you have?<\/p>\n<p data-block-key=\"aid6n\">Check out our new <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/a2ui-composer.ag-ui.com\/theater?scenario=restaurant-finder\">A2UI Theater<\/a> for a replay, or dive into <a rel=\"nofollow\" target=\"_blank\" href=\"http:\/\/a2ui.org\/\">A2UI.org<\/a> for docs, samples, and dev guides to start building flexible, portable generative UIs today.<\/p>\n<\/div><\/div>\n\n","protected":false},"excerpt":{"rendered":"<p>Generative UI allows AI agents to generate tailored UI widgets in real time, matching the interface to the user\u2019s specific 
interaction. But to move from demos to production, we need a clean separation of concerns. A2UI v0.9 is our answer: a framework-agnostic standard for declaring UI intent. It allows local or remote agents to communicate with [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":13913,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[56],"tags":[6944,8712,80,3745,3944,8711],"class_list":["post-13911","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-software","tag-a2ui","tag-frameworkagnostic","tag-generative","tag-portable","tag-standard","tag-v0-9"],"_links":{"self":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts\/13911","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=13911"}],"version-history":[{"count":1,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts\/13911\/revisions"}],"predecessor-version":[{"id":13912,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts\/13911\/revisions\/13912"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/media\/13913"}],"wp:attachment":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=13911"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=13911"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%
2Fv2%2Ftags&post=13911"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}