    How AutoScout24 built a Bot Factory to standardize AI agent development with Amazon Bedrock

    By Samuel Alejandro | January 16, 2026

    AutoScout24, a leading European automotive marketplace, set out to build a “Bot Factory”: a centralized framework for creating and deploying artificial intelligence (AI) agents that can perform tasks and make decisions within various workflows, significantly boosting operational efficiency across the organization.

    From Disparate Experiments to a Standardized Framework

    The emergence of powerful generative AI agents that can reason, plan, and act presented a clear opportunity for AutoScout24 to improve internal productivity, and several engineering teams began exploring the technology independently. While these experiments on Amazon Web Services (AWS) were successful, each team used its own tools and frameworks. The company’s vision was a unified, enterprise-grade framework that would give teams across the organization an easy path to building secure, scalable, and maintainable AI agents. To realize it, the AutoScout24 AI Platform Engineering team joined the AWS Prototyping and Cloud Engineering (PACE) team for a three-week AI bootcamp. The objective was to turn fragmented experiments into a cohesive strategy by developing a reusable blueprint, known as the Bot Factory, to standardize how future AI agents are created and operated within the company.

    The Challenge: Identifying a High-Impact Use Case

    To anchor the Bot Factory blueprint in a practical business scenario, the team focused on a significant operational cost: internal developer support. The problem was clear: AutoScout24 AI Platform engineers were spending up to 30% of their time on repetitive tasks such as answering questions, granting tool access, and locating documentation. This support burden reduced overall productivity, diverting skilled engineers from critical feature development and causing other developers to wait for routine requests. An automated support bot was identified as an ideal initial use case because it required two core agent functionalities:

    1. Knowledge retrieval: Answering “how-to” questions by searching internal documentation, a capability known as Retrieval Augmented Generation (RAG).
    2. Action execution: Performing tasks in other systems, such as assigning a GitHub Copilot license, which necessitates secure API integration, or “tool use.”

    By developing a bot capable of both, the team could validate the blueprint while delivering immediate business value.

    Architectural Overview

    This section explores the architecture AutoScout24 utilized to build its standardized AI development framework, enabling the rapid deployment of secure and scalable AI agents.

    Figure: Architecture diagram showing the AgentCore Runtime workflow, from Slack user interaction through AWS services to specialized worker agents accessing GitHub and an Amazon Bedrock knowledge base.

    The architecture features a simple, decoupled flow designed for resilience and ease of maintenance. The diagram presents a simplified view focusing on the core generative AI workflow. In a production environment, additional AWS services such as AWS Identity and Access Management (IAM), Amazon CloudWatch, AWS X-Ray, AWS CloudTrail, AWS WAF (web application firewall), and AWS Key Management Service (KMS) could be integrated to enhance security, observability, and operational governance.

    A request proceeds through the system as follows:

    1. User interaction via Slack: A developer posts a message in a support channel, for example, “@SupportBot, can I get a GitHub Copilot license?”
    2. Secure ingress via Amazon API Gateway & AWS Lambda: Slack sends the event to an Amazon API Gateway endpoint, which triggers an AWS Lambda function. This function performs a crucial security check, verifying the request’s cryptographic signature to confirm its authenticity from Slack.
    3. Decoupling via Amazon Simple Queue Service (SQS): The verified request is placed onto an Amazon SQS First-In, First-Out (FIFO) queue. This decouples the front-end from the agent, enhancing resilience. Using a FIFO queue with the message’s thread timestamp as the MessageGroupId ensures that replies within a single conversation are processed in order, which is vital for maintaining coherent conversations.
    4. Agent execution via Amazon Bedrock AgentCore: The SQS queue triggers a Lambda function upon message arrival, which activates the agent running in the AgentCore Runtime. AgentCore manages operational tasks, including orchestrating calls to the foundation model and the agent’s tools. The Orchestrator Agent’s logic, built with Strands Agents, analyzes the user’s prompt and determines the correct specialized agent to invoke—either the Knowledge Base Agent for a question or the GitHub Agent for an action request.
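The signature check in step 2 can be sketched as follows, using Slack’s documented v0 request-signing scheme (the function name and parameters are illustrative, not taken from AutoScout24’s implementation):

```python
import hashlib
import hmac
import time

def verify_slack_signature(signing_secret: str, timestamp: str, body: str,
                           received_signature: str, max_age_s: int = 300) -> bool:
    """Verify a request against Slack's X-Slack-Signature header (v0 scheme)."""
    # Reject stale requests to limit replay attacks
    if abs(time.time() - int(timestamp)) > max_age_s:
        return False
    # Slack signs the string "v0:<timestamp>:<raw request body>"
    basestring = f"v0:{timestamp}:{body}".encode("utf-8")
    expected = "v0=" + hmac.new(signing_secret.encode("utf-8"),
                                basestring, hashlib.sha256).hexdigest()
    # Constant-time comparison to avoid timing side channels
    return hmac.compare_digest(expected, received_signature)
```

Only requests that pass this check are forwarded to the SQS queue; everything else is rejected at the edge.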

    A critical implementation detail is how the system uses AgentCore’s session isolation. To maintain conversational context, a unique, deterministic sessionId is generated for each Slack thread by combining the channel ID and the thread’s timestamp, and this sessionId is passed with every agent invocation within that thread. Because all interactions in a thread share the same sessionId, the agent treats them as one continuous conversation, while interactions in other threads receive different sessionIds and keep their contexts separate. AgentCore allocates separate resources per sessionId, preventing context and state leakage between threads. In practice, if a developer sends multiple messages in one Slack thread, the agent retains the earlier parts of that conversation; each thread’s history is automatically preserved by AgentCore.
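The deterministic sessionId derivation can be sketched in a few lines (the exact format and separator are assumptions; the article only states that the channel ID and thread timestamp are combined):

```python
def session_id_for_thread(channel_id: str, thread_ts: str) -> str:
    """Derive a deterministic session ID for a Slack thread.

    Every message in the same thread maps to the same ID, so AgentCore
    treats them as one conversation; other threads get different IDs.
    The "-" separator is illustrative.
    """
    return f"{channel_id}-{thread_ts}"
```

Because the ID is derived rather than stored, any Lambda invocation can recompute it from the incoming Slack event without a lookup table.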

    This session management strategy is also crucial for observability. Using the unique sessionId, each interaction can be traced end to end with AWS X-Ray: from the Slack message arriving at API Gateway, to its enqueuing in SQS, through the orchestrator’s processing, the call to the foundation model, any subsequent tool invocations (such as a knowledge-base lookup or a GitHub API call), and finally the response back to Slack.

    Metadata and timing on each step show where time is spent. If a step fails or is slow (for example, a timeout on an external API call), X-Ray pinpoints the cause, which is invaluable for quickly diagnosing problems and building confidence in the system’s behavior.

    The Solution: A Reusable Blueprint Powered by AWS

    The Bot Factory architecture, developed by the AutoScout24 and AWS teams, is event-driven, serverless, and built upon managed AWS services. This approach provides a resilient and scalable pattern adaptable for new use cases.

    The solution leverages Amazon Bedrock and its integrated capabilities:

    • Amazon Bedrock provides access to high-performing foundation models (FMs), serving as the reasoning engine for the agent.
    • Amazon Bedrock Knowledge Bases enables the RAG capability, allowing the agent to connect to AutoScout24’s internal documentation and retrieve information to answer questions accurately.
    • Amazon Bedrock AgentCore is a key operational component of the blueprint. It offers the fully managed, serverless runtime environment for deploying, operating, and scaling agents.

    This solution offers a significant advantage for AutoScout24. Instead of developing foundational infrastructure for session management, security, and observability, the company utilizes AgentCore’s purpose-built services. This allows the team to concentrate on the agent’s business logic rather than the underlying infrastructure. AgentCore also includes built-in security and isolation features. Each agent invocation runs in its own isolated container, helping to prevent data leakage between sessions. Agents are assigned specific IAM roles to restrict their AWS permissions, adhering to the principle of least privilege. Credentials or tokens required by agent tools (such as a GitHub API key) are securely stored in AWS Secrets Manager and accessed at runtime. These features provide a secure environment for running agents with minimal custom infrastructure.

    The agent itself was constructed using the Strands Agents SDK, an open-source framework that simplifies defining an agent’s logic, tools, and behavior in Python. The combination proved effective: Strands for building the agent, and AgentCore for securely running it at scale.

    The team adopted an “agents-as-tools” design pattern, in which a central orchestrator agent acts as the main controller. Rather than containing the logic for every possible task, the orchestrator delegates requests to specialized, single-purpose agents. For the support bot, this included a Knowledge Base agent for informational queries and a GitHub agent for actions like assigning licenses. This modular design makes it straightforward to extend the system with new capabilities, such as a PR review agent, without re-architecting the entire pipeline.

    Running these agents on Amazon Bedrock adds further flexibility, since the team can select from a broad range of foundation models: more powerful models for complex reasoning tasks, and lighter, cost-efficient models for routine worker agents such as GitHub license requests or operational workflows. Combining models this way lets AutoScout24 balance cost, performance, and accuracy across its agent architecture.
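The delegation idea behind “agents-as-tools” can be illustrated with a framework-free sketch (all names here are hypothetical; in the production system the Strands Agents SDK and the foundation model perform the routing, as shown in the next section):

```python
from typing import Callable, Dict

# Each "specialist" is exposed to the orchestrator as a plain callable;
# the orchestrator delegates without knowing the specialist's internals.
def knowledge_base_agent(query: str) -> str:
    return f"[KB] answer for: {query}"

def github_agent(query: str) -> str:
    return f"[GitHub] handled: {query}"

class Orchestrator:
    def __init__(self, tools: Dict[str, Callable[[str], str]]):
        self.tools = tools

    def route(self, intent: str, query: str) -> str:
        # In production, the foundation model chooses the tool from the
        # prompt; in this sketch the caller names the intent directly.
        return self.tools[intent](query)

bot = Orchestrator({"knowledge": knowledge_base_agent, "github": github_agent})
```

Adding a new capability is then a matter of registering another callable, which mirrors how new specialized agents slot into the Bot Factory.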

    Orchestrator Agent: Built with Strands SDK

    The Strands Agents SDK helped define the orchestrator agent with concise, declarative code. The framework employs a model-driven approach, where developers focus on defining the agent’s instructions and tools, while the foundation model handles reasoning and planning. The orchestrator agent can be expressed in just a few dozen lines of Python. The example snippet below (simplified for clarity, not intended for direct use) illustrates how the agent is configured with a model, a system prompt, and a list of tools (which in this architecture represent the specialized agents):

    # A simplified, representative example of the orchestrator agent logic
    # built with the Strands Agents SDK and deployed on Amazon Bedrock AgentCore.
    from bedrock_agentcore.runtime import BedrockAgentCoreApp
    from strands import Agent
    from strands.models import BedrockModel
    from tools import knowledge_base_query_tool, github_copilot_seat_agent
    # Initialize the AgentCore application, which acts as the serverless container
    app = BedrockAgentCoreApp()
    class OrchestratorAgent:
        def __init__(self):
            # 1. Define the Model: Point to a foundation model in Amazon Bedrock.
            self.model = BedrockModel(model_id="anthropic.claude-3-sonnet-20240229-v1:0")
            
            # 2. Define the Prompt: Give the agent its core instructions.
            self.system_prompt = """
            You are a helpful and friendly support bot for the AutoScout24 Platform Engineering team.
            Your goal is to answer developer questions and automate common requests.
            Use your tools to answer questions or perform actions.
            If you cannot handle a request, politely say so.
            """
            
            # 3. Define the Tools: Provide the agent with its capabilities.
            # These tools are entry points to other specialized Strands agents.
            self.tools = [
                knowledge_base_query_tool, 
                github_copilot_seat_agent
            ]
            
            # Create the agent instance
            self.agent = Agent(
                model=self.model, 
                system_prompt=self.system_prompt, 
                tools=self.tools
            )
        def __call__(self, user_input: str):
            # Run the agent to get a response for the user's input
            return self.agent(user_input)
    # Define the entry point that AgentCore will invoke when a new event arrives from SQS
    @app.entrypoint
    def main(event):
        # Extract the user's query from the incoming event
        user_query = event.get("prompt")
        
        # Instantiate and run the orchestrator agent
        return OrchestratorAgent()(user_query)

    Another example is the GitHub Copilot license agent, implemented as a Strands tool function. The following snippet demonstrates its definition using the @tool decorator. This function creates a GitHubCopilotSeatAgent, passes the user’s request (a GitHub username) to it, and returns the result:

    from strands import Agent, tool

    class GitHubCopilotSeatAgent:
        # self.model, self.system_prompt, and self.tools are configured
        # during initialization (omitted here for brevity)
        def __call__(self, query: str):
            agent = Agent(model=self.model, system_prompt=self.system_prompt, tools=self.tools)
            return agent(query)

    @tool
    def github_copilot_seat_agent(github_username: str) -> str:
        agent = GitHubCopilotSeatAgent()
        response = agent(f"Request GitHub Copilot license for user: {github_username}")
        return str(response)

    Key benefits of this approach include a clear separation of concerns. Developers write declarative code focused on the agent’s purpose. Complex infrastructure logic, including scaling, session management, and secure execution, is managed by Amazon Bedrock AgentCore. This abstraction enables rapid development, allowing AutoScout24 to move from prototype to production more quickly. The tools list effectively makes other agents callable functions, enabling the orchestrator to delegate tasks without needing to know their internal implementation.

    The Impact: A Validated Blueprint for Enterprise AI

    The Bot Factory project delivered results beyond the initial prototype, creating immediate business value and establishing a strategic foundation for future AI innovation at AutoScout24. Key outcomes included:

    • A production-ready support bot: A functional Slack bot was deployed, actively reducing the manual support load on the AutoScout24 AI Platform Engineering Team by addressing the 30% of time previously spent on repetitive tasks.
    • A reusable Bot Factory blueprint: The project yielded a validated, reusable architectural pattern. Teams at AutoScout24 can now build new agents by starting with this proven template (Slack -> API Gateway -> SQS -> AgentCore). This significantly accelerates innovation by allowing teams to focus on their unique business logic, rather than reinventing infrastructure. This modular design also prepares them for more advanced multi-agent collaboration, potentially using standards like the Agent-to-Agent (A2A) protocol as needs evolve.
    • Enabling broader AI development: By abstracting away infrastructure complexity, the Bot Factory empowers more individuals to build AI solutions. A domain expert in security or data analytics can now create a new tool or specialized agent and integrate it into the factory without needing expertise in distributed systems.

    Conclusion: A New Model for Enterprise Agents

    AutoScout24’s collaboration with AWS transformed fragmented generative AI experiments into a scalable, standardized framework. By adopting Amazon Bedrock AgentCore, the team successfully moved its support bot from prototype to production, while maintaining focus on its Bot Factory vision. AgentCore manages session state and scaling, allowing engineers to concentrate on high-value business logic instead of infrastructure. The result is more than a support bot: it is a reusable foundation for building enterprise agents. With AgentCore, AutoScout24 can efficiently transition from prototype to production, setting a model for how organizations can standardize generative AI development on AWS.
