{"id":13131,"date":"2026-03-27T02:04:38","date_gmt":"2026-03-27T02:04:38","guid":{"rendered":"https:\/\/techtrendfeed.com\/?p=13131"},"modified":"2026-03-27T02:04:39","modified_gmt":"2026-03-27T02:04:39","slug":"getting-began-with-smolagents-construct-your-first-code-agent-in-15-minutes","status":"publish","type":"post","link":"https:\/\/techtrendfeed.com\/?p=13131","title":{"rendered":"Getting Began with Smolagents: Construct Your First Code Agent in 15 Minutes"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div id=\"post-\">\n<p>    <center><img decoding=\"async\" alt=\"Getting Started with smolagents: Build Your First Code Agent in 15 Minutes\" width=\"100%\" class=\"perfmatters-lazy\" src=\"https:\/\/www.kdnuggets.com\/wp-content\/uploads\/Getting-Started-with-Smolagents-Build-Your-First-Code-Agent-in-15-Minutes.png\"\/><br \/><span>Picture by Writer<\/span><\/center><br \/>\n\u00a0<\/p>\n<h2><span>#\u00a0<\/span>Introduction<\/h2>\n<p>\u00a0<br \/>AI has moved from merely chatting with <strong><a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/www.kdnuggets.com\/large-language-models-beginners-guide-2025\" target=\"_blank\">massive language fashions<\/a><\/strong> (LLMs) to giving them legs and arms, which permits them to carry out actions within the digital world. These are sometimes referred to as Python AI brokers \u2014 autonomous software program packages powered by LLMs that may understand their surroundings, make choices, use exterior instruments (like APIs or code execution), and take actions to attain particular objectives with out fixed human intervention.<\/p>\n<p>When you have been eager to experiment with constructing your individual AI agent however felt weighed down by advanced frameworks, you&#8217;re in the appropriate place. 
Today, we&#8217;re going to take a look at <strong><a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/huggingface.co\/docs\/smolagents\/index\" target=\"_blank\">smolagents<\/a><\/strong>, a powerful yet extremely simple library developed by <strong><a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/huggingface.co\/\" target=\"_blank\">Hugging Face<\/a><\/strong>.<\/p>\n<p>By the end of this article, you&#8217;ll understand what makes smolagents unique, and more importantly, you&#8217;ll have a functioning code agent that can fetch live data from the internet. Let&#8217;s explore the implementation.<\/p>\n<p>\u00a0<\/p>\n<h2><span>#\u00a0<\/span>Understanding Code Agents<\/h2>\n<p>\u00a0<br \/>Before we start coding, let&#8217;s understand the concept. An agent is essentially an LLM equipped with tools. You give the model a goal (like &#8220;get the current weather in London&#8221;), and it decides which tools to use to achieve that goal.<\/p>\n<p>What makes the Hugging Face agents in the smolagents library special is their approach to reasoning. Unlike many frameworks that generate JSON or text to decide which tool to use, smolagents agents are code agents. This means they write Python code snippets to chain together their tools and logic.<\/p>\n<p>This is powerful because code is precise. It is the most natural way to express complex instructions like loops, conditionals, and data manipulation. Instead of the LLM guessing how to combine tools, it simply writes the Python script to do it. As an open-source agent framework, smolagents is transparent, lightweight, and perfect for learning the fundamentals.<\/p>\n<p>\u00a0<\/p>\n<h4><span>\/\/\u00a0<\/span>Prerequisites<\/h4>\n<p>To follow along, you will need:<\/p>\n<ul>\n<li>Python knowledge. 
You should be comfortable with variables, functions, and pip installs.<\/li>\n<li>A Hugging Face token. Since we&#8217;re using the Hugging Face ecosystem, we&#8217;ll use their free Inference API. You can get a token by signing up at <strong><a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/huggingface.co\/\" target=\"_blank\">huggingface.co<\/a><\/strong> and visiting your settings.<\/li>\n<li>A Google account (optional). If you don&#8217;t want to install anything locally, you can run this code in a <strong><a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/colab.research.google.com\/\" target=\"_blank\">Google Colab<\/a><\/strong> notebook.<\/li>\n<\/ul>\n<p>\u00a0<\/p>\n<h2><span>#\u00a0<\/span>Setting Up Your Environment<\/h2>\n<p>\u00a0<br \/>Let&#8217;s get our workspace ready. Open your terminal or a new Colab notebook and create a project directory.<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>mkdir demo-project&#13;\ncd demo-project<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>Next, let&#8217;s set up our security token. It&#8217;s best to store this as an environment variable. If you&#8217;re using Google Colab, you can use the Secrets tab in the left panel to add <code style=\"background: #F5F5F5;\">HF_TOKEN<\/code> and then access it via <code style=\"background: #F5F5F5;\">userdata.get('HF_TOKEN')<\/code>.<\/p>\n<p>\u00a0<\/p>\n<h2><span>#\u00a0<\/span>Building Your First Agent: The Weather Fetcher<\/h2>\n<p>\u00a0<br \/>For our first project, we&#8217;ll build an agent that can fetch weather data for a given city. To do that, the agent needs a tool. A tool is just a function that the LLM can call. 
We&#8217;ll use a free, public API called <strong><a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/wttr.in\/\" target=\"_blank\">wttr.in<\/a><\/strong>, which serves weather data in plain-text and JSON formats.<\/p>\n<p>\u00a0<\/p>\n<h4><span>\/\/\u00a0<\/span>Installing and Setting Up<\/h4>\n<p>Create a virtual environment:<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>python -m venv env<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>A virtual environment isolates your project&#8217;s dependencies from your system. Now, let&#8217;s activate the virtual environment.<\/p>\n<p>Windows:<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>env\\Scripts\\activate<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>macOS\/Linux:<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>source env\/bin\/activate<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>You will see <code style=\"background: #F5F5F5;\">(env)<\/code> in your terminal when it is active.<\/p>\n<p>Install the required packages:<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>pip install smolagents requests python-dotenv<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>We&#8217;re installing smolagents, Hugging Face&#8217;s lightweight agent framework for building AI agents with tool-use capabilities; <strong><a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/pypi.org\/project\/requests\/\" target=\"_blank\">requests<\/a><\/strong>, the HTTP library for making API calls; and <strong><a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/pypi.org\/project\/python-dotenv\/\" target=\"_blank\">python-dotenv<\/a><\/strong>, which will load environment variables from a <code style=\"background: #F5F5F5;\">.env<\/code> file.<\/p>\n<p>That&#8217;s it \u2014 all with just one command. 
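Beneath these packages, everything the agent does comes down to writing ordinary Python, so loops and string handling come for free. Here is a stubbed, offline sketch of that idea (the <code style=\"background: #F5F5F5;\">get_weather<\/code> stub below is fake and for illustration only; the real, network-backed tool is defined later in this article):\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code># Offline illustration: the kind of snippet a code agent generates.&#13;\n# get_weather is a fake stub here, not a real weather lookup.&#13;\ndef get_weather(city: str) -&gt; str:&#13;\n    return f\"The weather in {city} is: Sunny +20\u00b0C\"&#13;\n&#13;\n# Plain Python constructs let the agent chain tool calls naturally:&#13;\nreports = [get_weather(city) for city in [\"Paris\", \"Tokyo\"]]&#13;\nprint(\" | \".join(reports))<\/code><\/pre>\n<\/div>\nNothing here touches the network; it only shows why generating code, rather than JSON tool calls, is so expressive. 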
This simplicity is a core part of the smolagents philosophy.<\/p>\n<p>\u00a0<\/p>\n<p><center><img decoding=\"async\" alt=\"Installing smolagents\" width=\"100%\" class=\"perfmatters-lazy\" src=\"https:\/\/www.kdnuggets.com\/wp-content\/uploads\/smolgents11.png\"\/><br \/><span>Figure 1: Installing smolagents<\/span><\/center><br \/>\n\u00a0<\/p>\n<h4><span>\/\/\u00a0<\/span>Setting Up Your API Token<\/h4>\n<p>Create a <strong>.env<\/strong> file in your project root and paste this line, replacing the placeholder with your actual token:<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>HF_TOKEN=your_huggingface_token_here<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>Get your token from <strong><a rel=\"nofollow\" target=\"_blank\" href=\"http:\/\/huggingface.co\/settings\/tokens\" target=\"_blank\">huggingface.co\/settings\/tokens<\/a><\/strong>. Your project structure should look like this:<\/p>\n<p>\u00a0<\/p>\n<p><center><img decoding=\"async\" alt=\"Project structure\" width=\"100%\" class=\"perfmatters-lazy\" src=\"https:\/\/www.kdnuggets.com\/wp-content\/uploads\/Screenshot-2026-03-09-at-16.41.58.png\"\/><br \/><span>Figure 2: Project structure<\/span><\/center><br \/>\n\u00a0<\/p>\n<h4><span>\/\/\u00a0<\/span>Importing Libraries<\/h4>\n<p>Open your <code style=\"background: #F5F5F5;\">demo.py<\/code> file and paste the following code:<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>import requests&#13;\nimport os&#13;\nfrom dotenv import load_dotenv&#13;\nfrom smolagents import tool, CodeAgent, InferenceClientModel&#13;\n&#13;\nload_dotenv()  # read HF_TOKEN from the .env file into the environment<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<ul>\n<li><code style=\"background: #F5F5F5;\">requests<\/code>: For making HTTP calls to the weather API<\/li>\n<li><code style=\"background: #F5F5F5;\">load_dotenv<\/code>: Loads the <code style=\"background: #F5F5F5;\">.env<\/code> file so the token is visible to <code style=\"background: #F5F5F5;\">os.getenv<\/code><\/li>\n<li><code style=\"background: #F5F5F5;\">os<\/code>: To securely read environment 
variables<\/li>\n<li><code style=\"background: #F5F5F5;\">smolagents<\/code>: Hugging Face&#8217;s lightweight agent framework providing:\n<ul>\n<li><code style=\"background: #F5F5F5;\">@tool<\/code>: A decorator to define agent-callable functions.<\/li>\n<li><code style=\"background: #F5F5F5;\">CodeAgent<\/code>: An agent that writes and executes Python code.<\/li>\n<li><code style=\"background: #F5F5F5;\">InferenceClientModel<\/code>: Connects to Hugging Face&#8217;s hosted LLMs.<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p>In smolagents, defining a tool is simple. We&#8217;ll create a function that takes a city name as input and returns the weather conditions. Add the following code to your <code style=\"background: #F5F5F5;\">demo.py<\/code> file:<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>@tool&#13;\ndef get_weather(city: str) -&gt; str:&#13;\n    \"\"\"&#13;\n    Returns the current weather forecast for a specified city.&#13;\n    Args:&#13;\n        city: The name of the city to get the weather for.&#13;\n    \"\"\"&#13;\n    # Using wttr.in, a lovely free weather service&#13;\n    response = requests.get(f\"https:\/\/wttr.in\/{city}?format=%C+%t\")&#13;\n    if response.status_code == 200:&#13;\n        # The response is plain text like \"Partly cloudy +15\u00b0C\"&#13;\n        return f\"The weather in {city} is: {response.text.strip()}\"&#13;\n    else:&#13;\n        return \"Sorry, I could not fetch the weather data.\"<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>Let&#8217;s break this down:<\/p>\n<ul>\n<li>We import the <code style=\"background: #F5F5F5;\">tool<\/code> decorator from smolagents. 
This decorator transforms our regular Python function into a tool that the agent can understand and use.<\/li>\n<li>The docstring (<code style=\"background: #F5F5F5;\">\"\"\" ... \"\"\"<\/code>) in the <code style=\"background: #F5F5F5;\">get_weather<\/code> function is important. The agent reads this description to understand what the tool does and how to use it.<\/li>\n<li>Inside the function, we make a simple HTTP request to <strong>wttr.in<\/strong>, a free weather service that returns plain-text forecasts.<\/li>\n<li>Type hints (<code style=\"background: #F5F5F5;\">city: str<\/code>) tell the agent what inputs to provide.<\/li>\n<\/ul>\n<p>This is a perfect example of tool calling in action. We&#8217;re giving the agent a new capability.<\/p>\n<p>\u00a0<\/p>\n<h4><span>\/\/\u00a0<\/span>Configuring the LLM<\/h4>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>hf_token = os.getenv(\"HF_TOKEN\")&#13;\nif hf_token is None:&#13;\n    raise ValueError(\"Please set the HF_TOKEN environment variable\")&#13;\n&#13;\nmodel = InferenceClientModel(&#13;\n    model_id=\"Qwen\/Qwen2.5-Coder-32B-Instruct\",&#13;\n    token=hf_token&#13;\n)<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>The agent needs a brain \u2014 a large language model (LLM) that can reason about tasks. 
Here we use:<\/p>\n<ul>\n<li><code style=\"background: #F5F5F5;\">Qwen2.5-Coder-32B-Instruct<\/code>: A powerful code-focused model hosted on Hugging Face<\/li>\n<li><code style=\"background: #F5F5F5;\">HF_TOKEN<\/code>: Your Hugging Face API token, stored in a <code style=\"background: #F5F5F5;\">.env<\/code> file for security<\/li>\n<\/ul>\n<p>Now, we need to create the agent itself.<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>agent = CodeAgent(&#13;\n    tools=[get_weather],&#13;\n    model=model,&#13;\n    add_base_tools=False&#13;\n)<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p><code style=\"background: #F5F5F5;\">CodeAgent<\/code> is a special agent type that:<\/p>\n<ul>\n<li>Writes Python code to solve problems<\/li>\n<li>Executes that code in a sandboxed environment<\/li>\n<li>Can chain multiple tool calls together<\/li>\n<\/ul>\n<p>Here, we&#8217;re instantiating a <code style=\"background: #F5F5F5;\">CodeAgent<\/code>. We pass it a list containing our <code style=\"background: #F5F5F5;\">get_weather<\/code> tool and the model object. The <code style=\"background: #F5F5F5;\">add_base_tools=False<\/code> argument tells it not to include any default tools, keeping our agent simple for now.<\/p>\n<p>\u00a0<\/p>\n<h4><span>\/\/\u00a0<\/span>Running the Agent<\/h4>\n<p>This is the exciting part. Let&#8217;s give our agent a task. 
Run the agent with a specific prompt:<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>response = agent.run(&#13;\n    \"Can you tell me the weather in Paris and also in Tokyo?\"&#13;\n)&#13;\nprint(response)<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>When you call <code style=\"background: #F5F5F5;\">agent.run()<\/code>, the agent:<\/p>\n<ol>\n<li>Reads your prompt.<\/li>\n<li>Reasons about which tools it needs.<\/li>\n<li>Generates code that calls <code style=\"background: #F5F5F5;\">get_weather(\"Paris\")<\/code> and <code style=\"background: #F5F5F5;\">get_weather(\"Tokyo\")<\/code>.<\/li>\n<li>Executes the code and returns the results.<\/li>\n<\/ol>\n<p>\u00a0<\/p>\n<p><center><img decoding=\"async\" alt=\"smolagents response\" width=\"100%\" class=\"perfmatters-lazy\" src=\"https:\/\/www.kdnuggets.com\/wp-content\/uploads\/smolgents-demo-1.png\"\/><br \/><span>Figure 3: smolagents response<\/span><\/center><br \/>\n\u00a0<\/p>\n<p>When you run this code, you&#8217;ll witness the magic of a Hugging Face agent. The agent receives your request. It sees that it has a tool called <code style=\"background: #F5F5F5;\">get_weather<\/code>. 
It then writes a small Python script in its &#8220;mind&#8221; (using the LLM) that looks something like this:<\/p>\n<p>\u00a0<\/p>\n<blockquote>\n<p>\nThis is what the agent thinks, not code you write.\n<\/p>\n<\/blockquote>\n<p>\u00a0<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>weather_paris = get_weather(city=\"Paris\")&#13;\nweather_tokyo = get_weather(city=\"Tokyo\")&#13;\nfinal_answer(f\"Here is the weather: {weather_paris} and {weather_tokyo}\")<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p><center><img decoding=\"async\" alt=\"smolagents final response\" width=\"100%\" class=\"perfmatters-lazy\" src=\"https:\/\/www.kdnuggets.com\/wp-content\/uploads\/smolgents-demo-2.png\"\/><br \/><span>Figure 4: smolagents final response<\/span><\/center><br \/>\n\u00a0<\/p>\n<p>It executes this code, fetches the data, and returns a tidy answer. You have just built a code agent that can browse the web via APIs.<\/p>\n<p>\u00a0<\/p>\n<h4><span>\/\/\u00a0<\/span>How It Works Behind the Scenes<\/h4>\n<p>\u00a0<\/p>\n<p><center><img decoding=\"async\" alt=\"The inner workings of an AI code agent\" width=\"100%\" class=\"perfmatters-lazy\" src=\"https:\/\/www.kdnuggets.com\/wp-content\/uploads\/the-inner-workings-of-an-AI-code-agent.png\"\/><br \/><span>Figure 5: The inner workings of an AI code agent<\/span><\/center><\/p>\n<p>\u00a0<\/p>\n<h4><span>\/\/\u00a0<\/span>Taking It Further: Adding More Tools<\/h4>\n<p>The power of agents grows with their toolkit. What if we wanted to save the weather report to a file? 
We can create another tool.<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>@tool&#13;\ndef save_to_file(content: str, filename: str = \"weather_report.txt\") -&gt; str:&#13;\n    \"\"\"&#13;\n    Saves the provided text content to a file.&#13;\n    Args:&#13;\n        content: The text content to save.&#13;\n        filename: The name of the file to save to (default: weather_report.txt).&#13;\n    \"\"\"&#13;\n    with open(filename, \"w\") as f:&#13;\n        f.write(content)&#13;\n    return f\"Content successfully saved to {filename}\"&#13;\n&#13;\n# Re-initialize the agent with both tools&#13;\nagent = CodeAgent(&#13;\n    tools=[get_weather, save_to_file],&#13;\n    model=model,&#13;\n)<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<div style=\"width: 98%; overflow: auto; padding-left: 10px; padding-bottom: 10px; padding-top: 10px; background: #F5F5F5;\">\n<pre><code>agent.run(\"Get the weather for London and save the report to a file called london_weather.txt\")<\/code><\/pre>\n<\/div>\n<p>\u00a0<\/p>\n<p>Now, your agent can fetch data and interact with your local file system. This combination of skills is what makes Python AI agents so versatile.<\/p>\n<p>\u00a0<\/p>\n<h2><span>#\u00a0<\/span>Conclusion<\/h2>\n<p>\u00a0<br \/>In just a few minutes and with fewer than 20 lines of core logic, you have built a functional AI agent. We have seen how smolagents simplifies the process of creating code agents that write and execute Python to solve problems.<\/p>\n<p>The beauty of this open-source agent framework is that it removes the boilerplate, allowing you to focus on the fun part: building the tools and defining the tasks. 
You are no longer just chatting with an AI; you are collaborating with one that can act. This is just the beginning. You can now explore giving your agent access to the web via search APIs, hook it up to a database, or let it control a web browser.<\/p>\n<p>\u00a0<\/p>\n<h4><span>\/\/\u00a0<\/span>References and Learning Resources<\/h4>\n<p>\u00a0<br \/>\u00a0<\/p>\n<p><strong><a rel=\"nofollow noopener noreferrer\" target=\"_blank\" href=\"https:\/\/www.linkedin.com\/in\/olumide-shittu\/\">Shittu Olumide<\/a><\/strong> is a software engineer and technical writer passionate about leveraging cutting-edge technologies to craft compelling narratives, with a keen eye for detail and a knack for simplifying complex concepts. You can also find Shittu on <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/twitter.com\/Shittu_Olumide_\">Twitter<\/a>.<\/p>\n<\/p><\/div>\n<p><br \/>\n<br \/><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Image by Author \u00a0 #\u00a0Introduction \u00a0AI has moved from simply chatting with large language models (LLMs) to giving them arms and legs, allowing them to perform actions in the digital world. 
These are often called Python AI agents \u2014 autonomous software programs powered by LLMs that can perceive their environment, [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":13133,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[55],"tags":[75,73,977,2549,8395,2296],"class_list":["post-13131","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-machine-learning","tag-agent","tag-build","tag-code","tag-minutes","tag-smolagents","tag-started"],"_links":{"self":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts\/13131","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=13131"}],"version-history":[{"count":1,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts\/13131\/revisions"}],"predecessor-version":[{"id":13132,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts\/13131\/revisions\/13132"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/media\/13133"}],"wp:attachment":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=13131"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=13131"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=13131"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}"
,"templated":true}]}}<!-- This website is optimized by Airlift. Learn more: https://airlift.net. Template:. Learn more: https://airlift.net. Template: 69d9690a190636c2e0989534. Config Timestamp: 2026-04-10 21:18:02 UTC, Cached Timestamp: 2026-05-12 03:57:26 UTC -->