{"id":6758,"date":"2025-09-17T17:51:55","date_gmt":"2025-09-17T17:51:55","guid":{"rendered":"https:\/\/techtrendfeed.com\/?p=6758"},"modified":"2025-09-17T17:51:55","modified_gmt":"2025-09-17T17:51:55","slug":"streamline-entry-to-iso-rating-content-material-modifications-with-verisk-ranking-insights-and-amazon-bedrock","status":"publish","type":"post","link":"https:\/\/techtrendfeed.com\/?p=6758","title":{"rendered":"Streamline entry to ISO-rating content material modifications with Verisk ranking insights and Amazon Bedrock"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div id=\"\">\n<p><em>This publish is co-written with Samit Verma, Eusha Rizvi, Manmeet Singh, Troy Smith, and Corey Finley from Verisk.<\/em><\/p>\n<p>Verisk Ranking Insights as a function of <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/www.verisk.com\/products\/erc\/\" target=\"_blank\" rel=\"noopener noreferrer\">ISO Digital Ranking Content material<\/a> (ERC) is a strong device designed to offer summaries of ISO Ranking modifications between two releases. Historically, extracting particular submitting info or figuring out variations throughout a number of releases required guide downloads of full packages, which was time-consuming and liable to inefficiencies. This problem, coupled with the necessity for correct and well timed buyer help, prompted <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/www.verisk.com\/\" target=\"_blank\" rel=\"noopener noreferrer\">Verisk<\/a> to discover modern methods to boost person accessibility and automate repetitive processes. 
Using <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/aws.amazon.com\/generative-ai\/\" target=\"_blank\" rel=\"noopener noreferrer\">generative AI<\/a> and <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/aws.amazon.com\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Web Services<\/a> (AWS) services, Verisk has made significant strides in creating a conversational user interface where users can easily retrieve specific information, identify content differences, and improve overall operational efficiency.<\/p>\n<p>In this post, we dive into how Verisk Rating Insights, powered by <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/aws.amazon.com\/bedrock\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Bedrock<\/a>, <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/aws.amazon.com\/what-is\/large-language-model\/\" target=\"_blank\" rel=\"noopener noreferrer\">large language models<\/a> (LLMs), and <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/aws.amazon.com\/what-is\/retrieval-augmented-generation\/\" target=\"_blank\" rel=\"noopener noreferrer\">Retrieval Augmented Generation<\/a> (RAG), is transforming the way customers interact with and access ISO ERC changes.<\/p>\n<h2>The challenge<\/h2>\n<p>Rating Insights provides valuable content, but there were significant challenges with user accessibility and the time it took to extract actionable insights:<\/p>\n<ol>\n<li><strong>Manual downloading<\/strong> \u2013 Customers had to download entire packages to get even a small piece of relevant information. 
This was inefficient, especially when only part of the filing needed to be reviewed.<\/li>\n<li><strong>Inefficient data retrieval<\/strong> \u2013 Users couldn\u2019t quickly identify the differences between two content packages without downloading and manually comparing them, which could take hours and sometimes days of analysis.<\/li>\n<li><strong>Time-consuming customer support<\/strong> \u2013 Verisk\u2019s ERC Customer Support team spent 15% of their time each week addressing queries from customers impacted by these inefficiencies. Additionally, onboarding new customers required half a day of repetitive training to ensure they understood how to access and interpret the data.<\/li>\n<li><strong>Manual review time<\/strong> \u2013 Customers often spent 3\u20134 hours per test case analyzing the differences between filings. With multiple test cases to handle, this led to significant delays in critical decision-making.<\/li>\n<\/ol>\n<h2>Solution overview<\/h2>\n<p>To solve these challenges, Verisk embarked on a journey to enhance Rating Insights with generative AI technologies. By integrating <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/aws.amazon.com\/bedrock\/claude\/\" target=\"_blank\" rel=\"noopener noreferrer\">Anthropic\u2019s Claude<\/a>, available in Amazon Bedrock, and <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/aws.amazon.com\/opensearch-service\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon OpenSearch Service<\/a>, Verisk created a sophisticated conversational platform where users can effortlessly access and analyze rating content changes.<\/p>\n<p>The following diagram illustrates the high-level architecture of the solution, with distinct sections showing the data ingestion process and the inference loop. The architecture uses several AWS services to add generative AI capabilities to the Rating Insights system. 
The system\u2019s components work together seamlessly, coordinating multiple LLM calls to generate user responses.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-116313 size-full\" style=\"margin: 10px 0px 10px 0px;border: 1px solid #CCCCCC\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/ML-18632-complete-arch.png\" alt=\"End-to-end AWS conversational AI system with UI, ingestion, response evaluation, and analytics components integrated via serverless services\" width=\"954\" height=\"1066\"\/><\/p>\n<p>The following diagram shows the architectural components and the high-level steps involved in the data ingestion process.<\/p>\n<table style=\"width: 50%\">\n<tbody>\n<tr>\n<td><img decoding=\"async\" loading=\"lazy\" class=\"alignnone wp-image-116316 size-full\" style=\"margin: 10px 0px 10px 0px;border: 1px solid #CCCCCC\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/ML-18632-data-ingestion.png\" alt=\"AWS document processing architecture showing rating data ingestion flow through Lambda, embedding model, and OpenSearch service\" width=\"918\" height=\"991\"\/><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>The steps in the data ingestion process proceed as follows:<\/p>\n<ol>\n<li>This process is triggered when a new file is dropped. It&#8217;s responsible for chunking the document using a custom chunking strategy. This strategy recursively checks each section and keeps sections intact, without overlap. 
The process then embeds the chunks and stores them in OpenSearch Service as vector embeddings.<\/li>\n<li>The embedding model used in Amazon Bedrock is amazon.titan-embed-g1-text-02.<\/li>\n<li>Amazon OpenSearch Serverless is used as the vector embedding store, with metadata filtering capability.<\/li>\n<\/ol>\n<p>The following diagram shows the architectural components and the high-level steps involved in the inference loop that generates user responses.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"alignnone wp-image-116318 size-full\" style=\"margin: 10px 0px 10px 0px;border: 1px solid #CCCCCC\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/ML-18632-inference.png\" alt=\"AWS chat system architecture with user feedback, Verisk gateway, load balancing, caching, and dual AI model integration\" width=\"980\" height=\"440\"\/><\/p>\n<p>The steps in the inference loop proceed as follows:<\/p>\n<ol>\n<li>This component is responsible for several tasks: it supplements user questions with recent chat history, embeds the questions, retrieves relevant chunks from the vector database, and finally calls the generation model to synthesize a response.<\/li>\n<li>Amazon ElastiCache is used to store recent chat history.<\/li>\n<li>The embedding model used in Amazon Bedrock is amazon.titan-embed-g1-text-02.<\/li>\n<li>OpenSearch Serverless is implemented for RAG (Retrieval Augmented Generation).<\/li>\n<li>To generate responses to user queries, the system uses Anthropic\u2019s Claude 3.5 Sonnet (model ID: anthropic.claude-3-5-sonnet-20240620-v1:0), which is available through Amazon Bedrock.<\/li>\n<\/ol>\n<h2>Key technologies and frameworks used<\/h2>\n<p>We used Anthropic\u2019s Claude 3.5 Sonnet (model ID: anthropic.claude-3-5-sonnet-20240620-v1:0) to understand user input and provide detailed, contextually relevant responses. 
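The section-preserving chunking described in the ingestion steps above can be illustrated with a short sketch. This is not Verisk's implementation; the boundary patterns and size threshold are assumptions chosen for illustration, and the only property the sketch aims to show is that sections are kept intact and chunks never overlap:

```python
import re

def chunk_sections(text, max_chars=1000):
    """Recursively split text along section boundaries, keeping each
    section intact and producing non-overlapping chunks."""
    if len(text) <= max_chars:
        return [text] if text.strip() else []
    # Prefer coarse boundaries (headings), then paragraphs, then lines.
    for pattern in (r"\n(?=#+ )", r"\n\n", r"\n"):
        parts = re.split(pattern, text)
        if len(parts) > 1:
            chunks = []
            for part in parts:
                # Recurse: a section that still exceeds the limit is
                # split again at the next-finer boundary.
                chunks.extend(chunk_sections(part, max_chars))
            return chunks
    # No boundary at all: fall back to a hard, non-overlapping split.
    return [text[:max_chars]] + chunk_sections(text[max_chars:], max_chars)
```

Each chunk produced this way would then be embedded and indexed as a separate vector document.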
Anthropic\u2019s Claude 3.5 Sonnet enhances the platform\u2019s ability to interpret user queries and deliver accurate insights from complex content changes. LlamaIndex, an open source framework, served as the chain framework for efficiently connecting and managing different data sources to enable dynamic retrieval of content and insights.<\/p>\n<p>We implemented RAG, which allows the model to pull specific, relevant data from the OpenSearch Serverless vector database. This means the system generates precise, up-to-date responses based on a user\u2019s query without needing to sift through massive content downloads. The vector database enables intelligent search and retrieval, organizing content changes in a way that makes them quickly and easily accessible. This eliminates the need to manually search through or download entire content packages. Verisk applied <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/aws.amazon.com\/bedrock\/guardrails\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Bedrock Guardrails<\/a> together with custom guardrails around the generative model so the output adheres to specific compliance and quality standards, safeguarding the integrity of responses.<\/p>\n<p>Verisk\u2019s generative AI solution is built on Amazon Bedrock, a comprehensive, secure, and flexible service for building generative AI applications and agents. 
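Conceptually, the retrieval step of a RAG pipeline like this reduces to a nearest-neighbor search over stored chunk embeddings. In the system described here that search is handled by OpenSearch Serverless against Titan embeddings; the toy in-memory sketch below (hypothetical two-dimensional vectors and chunk texts) only illustrates the ranking idea, not the production API:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, index, top_k=2):
    """Rank indexed chunks by similarity to the query embedding and
    return the texts of the top_k closest ones."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item["vector"]),
                    reverse=True)
    return [item["text"] for item in ranked[:top_k]]
```

The retrieved chunk texts are what get packed into the generation model's context window, which is why chunk boundaries and embedding quality directly affect answer quality.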
Amazon Bedrock connects you to leading FMs, services to deploy and operate agents, and tools for fine-tuning, safeguarding, and optimizing models, along with knowledge bases to connect applications to your latest data, so you have everything you need to move quickly from experimentation to real-world deployment.<\/p>\n<p>Given the novelty of generative AI, Verisk has established a governance council to oversee its solutions, ensuring they meet security, compliance, and data usage standards. Verisk implemented strict controls within the RAG pipeline so that data is only accessible to authorized users. This helps maintain the integrity and privacy of sensitive information. Legal reviews ensure IP protection and contract compliance.<\/p>\n<h2>How it works<\/h2>\n<p>The integration of these advanced technologies enables a seamless, user-friendly experience. Here\u2019s how Verisk Rating Insights now works for customers:<\/p>\n<ol>\n<li><strong>Conversational user interface<\/strong> \u2013 Users interact with the platform through a conversational interface. Instead of manually reviewing content packages, users enter a natural language query (for example, \u201cWhat are the changes in coverage scope between the two recent filings?\u201d). The system uses Anthropic\u2019s Claude 3.5 Sonnet to understand the intent and provides an instant summary of the relevant changes.<\/li>\n<li><strong>Dynamic content retrieval<\/strong> \u2013 Thanks to RAG and OpenSearch Service, the platform doesn\u2019t require downloading entire files. 
Instead, it dynamically retrieves and presents the specific changes a user is looking for, enabling quicker analysis and decision-making.<\/li>\n<li><strong>Automated difference analysis<\/strong> \u2013 The system can automatically compare two content packages, highlighting the differences without requiring manual intervention. Users can query for precise comparisons (for example, \u201cShow me the differences in rating criteria between Release 1 and Release 2\u201d).<\/li>\n<li><strong>Customized insights<\/strong> \u2013 The guardrails in place mean that responses are accurate, compliant, and actionable. Additionally, if needed, the system can help users understand the impact of changes and assist them in navigating the complexities of filings, providing clear, concise insights.<\/li>\n<\/ol>\n<p>The following diagram shows the architectural components and the high-level steps involved in the evaluation loop that keeps responses relevant and grounded.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"alignnone wp-image-116317 size-full\" style=\"margin: 10px 0px 10px 0px;border: 1px solid #CCCCCC\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/ML-18632-evaluation.png\" alt=\"Detailed AWS AI system showing user queries, generation model, response evaluation API, and result storage in S3 bucket\" width=\"1000\" height=\"728\"\/><\/p>\n<p>The steps in the evaluation loop proceed as follows:<\/p>\n<ol>\n<li>This component is responsible for calling the Anthropic Claude 3.5 Sonnet model and subsequently invoking the custom-built evaluation APIs to ensure response accuracy.<\/li>\n<li>The generation model employed is Anthropic\u2019s Claude 3.5 Sonnet, which handles the creation of responses.<\/li>\n<li>The Evaluation API ensures that responses stay relevant to user queries and remain grounded in the provided context.<\/li>\n<\/ol>\n<p>The following diagram 
shows the process of capturing the chat history as contextual memory and storing it for analysis.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"alignnone wp-image-116314 size-full\" style=\"margin: 10px 0px 10px 0px;border: 1px solid #CCCCCC\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/ML-18632-chat-history.png\" alt=\"AWS serverless chat analysis pipeline: Lambda for backup, S3 for storage, Snowflake for data warehousing, and dashboard visualization\" width=\"467\" height=\"498\"\/><\/p>\n<h2>Quality benchmarks<\/h2>\n<p>The Verisk Rating Insights team has implemented a comprehensive evaluation framework and feedback loop mechanism, shown in the preceding figures, to support continuous improvement and address issues that may arise.<\/p>\n<p>Ensuring high accuracy and consistency in responses is critical for Verisk\u2019s generative AI solutions. However, LLMs can sometimes produce hallucinations or provide irrelevant details, affecting reliability. To address this, Verisk implemented:<\/p>\n<ul>\n<li><strong>Evaluation framework<\/strong> \u2013 Integrated into the query pipeline, it validates responses for precision and relevance before delivery.<\/li>\n<li><strong>Extensive testing<\/strong> \u2013 Product subject matter experts (SMEs) and quality specialists rigorously tested the solution to ensure accuracy and reliability. Verisk collaborated with in-house insurance domain experts to develop SME evaluation metrics for accuracy and consistency. Multiple rounds of SME evaluations were conducted, in which experts graded these metrics on a 1\u201310 scale. Latency was also tracked to assess speed. 
Feedback from each round was incorporated into subsequent tests to drive improvements.<\/li>\n<li><strong>Continual model improvement<\/strong> \u2013 Customer feedback serves as a vital component in driving the continuous evolution and refinement of the generative models, enhancing both accuracy and relevance. By seamlessly integrating user interactions and feedback with chat history, a robust data pipeline is created that streams the user interactions to an <a rel=\"nofollow\" target=\"_blank\" href=\"https:\/\/aws.amazon.com\/s3\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Simple Storage Service<\/a> (Amazon S3) bucket, which acts as a data hub. The interactions then go into Snowflake, a cloud-based data platform and data warehouse as a service that provides capabilities such as data warehousing, data lakes, data sharing, and data exchange. Through this integration, we built comprehensive analytics dashboards that provide valuable insights into user experience patterns and pain points.<\/li>\n<\/ul>\n<p>Although the initial results were promising, they didn\u2019t meet the desired accuracy and consistency levels. The development process involved multiple iterative enhancements, such as redesigning the system and making multiple calls to the LLM. The primary metric for success was a manual grading system in which business experts compared the results and provided continuous feedback to improve overall benchmarks.<\/p>\n<h2>Business impact and opportunity<\/h2>\n<p>By integrating generative AI into Verisk Rating Insights, the business has seen a remarkable transformation. Customers enjoyed significant time savings. By eliminating the need to download entire packages and manually search for differences, the time spent on analysis has been drastically reduced. Customers no longer spend 3\u20134 hours per test case. 
What once took days now takes minutes.<\/p>\n<p>This time savings brought increased productivity. With an automated solution that instantly provides relevant insights, customers can focus more on decision-making rather than spending time on manual data retrieval. And by automating difference analysis and providing a centralized, simple platform, customers can be more confident in the accuracy of their results and avoid missing critical changes.<\/p>\n<p>For Verisk, the benefit was a reduced customer support burden, because the ERC customer support team now spends less time addressing queries. With the AI-powered conversational interface, users can self-serve and get answers in real time, freeing up support resources for more complex inquiries.<\/p>\n<p>The automation of repetitive training tasks meant quicker and more efficient customer onboarding. This reduces the need for lengthy training sessions, and new customers become proficient sooner. The integration of generative AI has reduced redundant workflows and the need for manual intervention. This streamlines operations across multiple departments, leading to a more agile and responsive business.<\/p>\n<h2>Conclusion<\/h2>\n<p>Looking ahead, Verisk plans to continue enhancing the Rating Insights platform in two ways. First, we\u2019ll expand the scope of queries, enabling more sophisticated queries related to different filing types and more nuanced coverage areas. Second, we\u2019ll scale the platform. With Amazon Bedrock providing the infrastructure, Verisk aims to scale this solution further to support more users and more content sets across various product lines.<\/p>\n<p>Verisk Rating Insights, now powered by generative AI and AWS technologies, has transformed the way customers interact with and access rating content changes. 
Through a conversational user interface, RAG, and vector databases, Verisk intends to eliminate inefficiencies and save customers valuable time and resources while improving overall accessibility. For Verisk, this solution has improved operational efficiency and provided a strong foundation for continued innovation.<\/p>\n<p>With Amazon Bedrock and a focus on automation, Verisk is driving the future of intelligent customer support and content management, empowering both their customers and their internal teams to make smarter, faster decisions.<\/p>\n<p>For more information, refer to the following resources:<\/p>\n<hr\/>\n<h3>About the authors<\/h3>\n<p style=\"clear: both\"><img decoding=\"async\" loading=\"lazy\" class=\"wp-image-116324 size-thumbnail alignleft\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/image-6-5-100x129.png\" alt=\"\" width=\"100\" height=\"129\"\/><strong>Samit Verma<\/strong> serves as the Director of Software Engineering at Verisk, overseeing the Rating and Coverage development teams. In this role, he plays a key part in architectural design and provides strategic direction to multiple development teams, improving efficiency and ensuring long-term solution maintainability. He holds a master\u2019s degree in information technology.<\/p>\n<p style=\"clear: both\"><img decoding=\"async\" loading=\"lazy\" class=\"alignleft wp-image-116325 size-thumbnail\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/image-7-3-100x129.jpeg\" alt=\"\" width=\"100\" height=\"129\"\/><strong>Eusha Rizvi<\/strong> serves as a Software Development Manager at Verisk, leading multiple technology teams within the Ratings Products division. With strong expertise in system design, architecture, and engineering, Eusha offers essential guidance that advances the development of innovative solutions. 
He holds a bachelor\u2019s degree in information systems from Stony Brook University.<\/p>\n<p style=\"clear: both\"><strong><img decoding=\"async\" loading=\"lazy\" class=\"alignleft wp-image-116326 size-thumbnail\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/image-8-5-100x122.png\" alt=\"\" width=\"100\" height=\"122\"\/>Manmeet Singh<\/strong> is a Software Engineering Lead at Verisk and an AWS Certified Generative AI Specialist. He leads the development of an agentic RAG-based generative AI system on Amazon Bedrock, with expertise in LLM orchestration, prompt engineering, vector databases, microservices, and high-availability architecture. Manmeet is passionate about applying advanced AI and cloud technologies to deliver resilient, scalable, and business-critical systems.<\/p>\n<p style=\"clear: both\"><strong><img decoding=\"async\" loading=\"lazy\" class=\"wp-image-116331 size-thumbnail alignleft\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/image-9-4-100x113.png\" alt=\"\" width=\"100\" height=\"113\"\/>Troy Smith<\/strong> is a Vice President of Rating Solutions at Verisk. Troy is a seasoned insurance technology leader with more than 25 years of experience in rating, pricing, and product strategy. At Verisk, he leads the team behind ISO Electronic Rating Content, a widely used resource across the insurance industry. Troy has held leadership roles at Earnix and Capgemini and was the cofounder and original creator of the Oracle Insbridge Rating Engine.<\/p>\n<p style=\"clear: both\"><strong><img decoding=\"async\" loading=\"lazy\" class=\"alignleft wp-image-116328 size-thumbnail\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/image-10-2-100x106.jpeg\" alt=\"\" width=\"100\" height=\"106\"\/>Corey Finley<\/strong> is a Product Manager at Verisk. 
Corey has over 22 years of experience across personal and commercial lines of insurance. He has worked in both implementation and product support roles and has led efforts for major carriers including Allianz, CNA, Citizens, and others. At Verisk, he serves as Product Manager for VRI, RaaS, and ERC.<\/p>\n<p style=\"clear: both\"><strong><img decoding=\"async\" loading=\"lazy\" class=\"alignleft wp-image-116329 size-thumbnail\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/image-11-3-100x123.png\" alt=\"\" width=\"100\" height=\"123\"\/>Arun Pradeep Selvaraj<\/strong> is a Senior Solutions Architect at Amazon Web Services (AWS). Arun is passionate about working with his customers and stakeholders on digital transformations and innovation in the cloud while continuing to learn, build, and reinvent. He&#8217;s creative, energetic, deeply customer-obsessed, and uses the working backward process to build modern architectures that help customers solve their unique challenges. Connect with him on LinkedIn.<\/p>\n<p style=\"clear: both\"><strong><img decoding=\"async\" loading=\"lazy\" class=\"alignleft wp-image-116330 size-thumbnail\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2025\/09\/12\/image-12-2-100x148.png\" alt=\"\" width=\"100\" height=\"148\"\/>Ryan Doty<\/strong> is a Solutions Architect Manager at Amazon Web Services (AWS), based out of New York. He helps financial services customers accelerate their adoption of the AWS Cloud by providing architectural guidelines to design innovative and scalable solutions. 
Coming from a software development and sales engineering background, he is excited by the possibilities that the cloud can bring to the world.<\/p>\n<p>       \n      <\/div>\n\n","protected":false},"excerpt":{"rendered":"<p>This post is co-written with Samit Verma, Eusha Rizvi, Manmeet Singh, Troy Smith, and Corey Finley from Verisk. Verisk Rating Insights, a feature of ISO Electronic Rating Content (ERC), is a powerful tool designed to provide summaries of ISO rating changes between two releases. Traditionally, extracting specific filing information or identifying differences [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":6760,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[55],"tags":[539,387,1289,2177,3010,5397,3457,5396,5398],"class_list":["post-6758","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-machine-learning","tag-access","tag-amazon","tag-bedrock","tag-content","tag-insights","tag-isorating","tag-rating","tag-streamline","tag-verisk"],"_links":{"self":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts\/6758","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=6758"}],"version-history":[{"count":1,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts\/6758\/revisions"}],"predecessor-version":[{"id":6759,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=\/wp\/v2\/posts\/6758\/revisions\/6759"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/tech
trendfeed.com\/index.php?rest_route=\/wp\/v2\/media\/6760"}],"wp:attachment":[{"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=6758"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=6758"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/techtrendfeed.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=6758"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}