Part 8 — Specialisation  ·  Track C of 4
Track C — AI Automation Engineer
Build enterprise automation workflows connecting AI to business systems
⏱ 2–3 Weeks 🟠 Advanced 🔧 MCP · n8n · Slack API · Jira · Webhooks
🎯

Track Overview

Specialisation C

Build enterprise automation workflows that connect AI to business systems. This is one of the fastest-growing segments of AI hiring: companies need engineers who can integrate AI with their existing software stacks without rebuilding everything from scratch.

Skills You Will Build

  • Model Context Protocol (MCP) — Anthropic's open standard for AI tool integration
  • n8n visual workflow automation with custom HTTP nodes for AI
  • Building Slack bots that respond with AI-generated answers
  • Jira ticket creation from AI classification output
  • Webhook-driven automation: receive event, classify with AI, act
  • Error handling and dead letter queues for multi-step workflows
🔧

Model Context Protocol — Build an MCP Server

Standard

MCP is Anthropic's open standard for connecting AI to external tools. Build an MCP server once and it works with Claude Desktop, Claude Code, and any other MCP-compatible client.
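It helps to know what MCP actually sends: under the hood it speaks JSON-RPC 2.0 over stdio. A `tools/call` request for a tool like the ones defined below looks roughly like this (a sketch — the `id` and query are arbitrary):

```python
import json

# A tools/call request as it travels over stdio (JSON-RPC 2.0).
# The method name and params shape follow the MCP spec.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_knowledge_base",
        "arguments": {"query": "How do I reset my VPN token?"},
    },
}

wire = json.dumps(request)          # serialized for the stdio transport
decoded = json.loads(wire)
print(decoded["method"])            # tools/call
print(decoded["params"]["name"])    # search_knowledge_base
```

The server below never touches this layer directly — the SDK parses the envelope and routes to your `@server.call_tool()` handler — but seeing the wire format makes debugging a misbehaving client much easier.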

pip install mcp

# mcp_server.py — expose your tools to any MCP client
from mcp.server import Server
from mcp.types import Tool, TextContent
import mcp.server.stdio as stdio
from mcp.server.models import InitializationOptions
import asyncio

server = Server("my-enterprise-tools")

@server.list_tools()
async def list_tools():
    return [
        Tool(
            name="search_knowledge_base",
            description="Search internal documentation. USE for product/process questions. DO NOT USE for general knowledge.",
            inputSchema={
                "type": "object",
                "properties": {"query": {"type": "string", "description": "Natural language search query"}},
                "required": ["query"]
            }
        ),
        Tool(
            name="create_jira_ticket",
            description="Create a Jira bug ticket. USE when user reports a software defect.",
            inputSchema={
                "type": "object",
                "properties": {
                    "summary": {"type": "string"},
                    "description": {"type": "string"},
                    "priority": {"type": "string", "enum": ["Low","Medium","High","Critical"]}
                },
                "required": ["summary", "description"]
            }
        ),
        Tool(
            name="get_user_info",
            description="Look up employee information by name or email.",
            inputSchema={
                "type": "object",
                "properties": {"identifier": {"type": "string"}},
                "required": ["identifier"]
            }
        )
    ]

@server.call_tool()
async def call_tool(name: str, arguments: dict):
    # rag_search, create_jira, and lookup_user are your own implementations —
    # wire them to your RAG pipeline, Jira client, and directory.
    if name == "search_knowledge_base":
        results = await rag_search(arguments["query"])
        return [TextContent(type="text", text=str(results))]
    elif name == "create_jira_ticket":
        ticket = await create_jira(arguments)
        return [TextContent(type="text", text=f"Created: {ticket['key']}")]
    elif name == "get_user_info":
        user = await lookup_user(arguments["identifier"])
        return [TextContent(type="text", text=str(user))]
    raise ValueError(f"Unknown tool: {name}")

async def main():
    async with stdio.stdio_server() as (read, write):
        await server.run(read, write, server.create_initialization_options())

if __name__ == "__main__":
    asyncio.run(main())

# Register in claude_desktop_config.json
# (macOS: ~/Library/Application Support/Claude/, Windows: %APPDATA%\Claude\):
# {
#   "mcpServers": {
#     "my-enterprise-tools": {
#       "command": "python",
#       "args": ["/path/to/mcp_server.py"]
#     }
#   }
# }
🔗

n8n Workflow Automation

Visual Automation
# Run n8n locally
docker run -it --rm -p 5678:5678 -v ~/.n8n:/home/node/.n8n n8nio/n8n
# Open http://localhost:5678

# n8n workflow: AI email triage
# [Gmail trigger: new email arrives]
#   -> [HTTP Request: POST to your FastAPI /classify {subject, body}]
#   -> [Switch node: route on classification]
#       -> billing: [Stripe API: fetch customer] -> [Gmail: send billing reply]
#       -> technical: [Jira: create ticket] -> [Slack: notify #eng-support]
#       -> general: [HTTP: POST /generate-reply] -> [Gmail: send AI reply]

# Your FastAPI endpoint that n8n calls:
from fastapi import FastAPI
from pydantic import BaseModel
import instructor, anthropic
from typing import Literal

app = FastAPI()
instr_client = instructor.from_anthropic(anthropic.Anthropic())

class EmailRequest(BaseModel):
    subject: str
    body: str

class ClassifyResponse(BaseModel):
    category: Literal["billing", "technical", "general", "complaint"]
    confidence: float
    suggested_response: str

@app.post("/classify")
def classify_email(request: EmailRequest) -> ClassifyResponse:
    # Plain def: the instructor client is synchronous, so FastAPI
    # runs this in a threadpool instead of blocking the event loop
    return instr_client.messages.create(
        model="claude-3-haiku-20240307",
        max_tokens=256,
        messages=[{"role": "user", "content":
            f"Classify this support email and suggest a brief response.\n\n"
            f"Subject: {request.subject}\nBody: {request.body[:500]}"}],
        response_model=ClassifyResponse
    )

@app.post("/generate-reply")
async def generate_reply(request: EmailRequest) -> dict:
    # rag_search is your own retrieval helper — use it to ground the reply in documentation
    context = await rag_search(request.subject + " " + request.body[:200])
    response = anthropic.Anthropic().messages.create(
        model="claude-3-5-sonnet-20241022", max_tokens=512,
        messages=[{"role": "user", "content":
            f"Write a professional reply to this email.\n\n"
            f"Context: {context}\n\nEmail: Subject: {request.subject}\n{request.body}"}]
    )
    return {"reply": response.content[0].text}
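The Switch node's branching is easier to trust if you mirror it in plain Python and unit-test it outside the workflow editor. A sketch with a hypothetical route table and confidence threshold:

```python
# Hypothetical route table mirroring the n8n Switch node.
# Action names are illustrative labels for downstream branches.
ROUTES: dict[str, str] = {
    "billing":   "stripe_lookup_then_reply",
    "technical": "jira_ticket_then_slack",
    "general":   "ai_generated_reply",
    "complaint": "escalate_to_human",
}

def route(category: str, confidence: float, threshold: float = 0.7) -> str:
    """Pick the downstream action; fall back to a human below the threshold
    or for any category the table does not know."""
    if confidence < threshold:
        return "escalate_to_human"
    return ROUTES.get(category, "escalate_to_human")

print(route("technical", 0.92))  # jira_ticket_then_slack
print(route("billing", 0.41))    # escalate_to_human
```

Keeping this table in code means a new category added to `ClassifyResponse` fails loudly in tests rather than silently falling through the Switch node.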
💬

Slack Bot with AI Responses

Chat Integration
pip install slack-sdk slack-bolt

import os

from slack_bolt import App
from slack_bolt.adapter.fastapi import SlackRequestHandler
import anthropic

slack_app = App(token=os.environ["SLACK_BOT_TOKEN"],
                signing_secret=os.environ["SLACK_SIGNING_SECRET"])
handler = SlackRequestHandler(slack_app)
ai_client = anthropic.Anthropic()

@slack_app.event("app_mention")
def handle_mention(event, say, client):  # Bolt injects the Slack WebClient as `client`
    user_message = event["text"]
    channel = event["channel"]
    user = event["user"]

    # Acknowledge with an emoji reaction while we work
    client.reactions_add(channel=channel, timestamp=event["ts"], name="thinking_face")

    # classify_request, rag_query, create_jira_ticket are your own helpers
    category = classify_request(user_message)

    if category == "technical":
        # RAG-powered answer
        result = rag_query(user_message)
        response = result["answer"]
        sources = "\n".join([f"• {s['source']}" for s in result["sources"][:3]])
        say(f"{response}\n\nSources:\n{sources}", thread_ts=event["ts"])

    elif category == "ticket":
        # Create a Jira ticket and link back to it
        ticket = create_jira_ticket(
            summary=user_message[:80],
            description=f"Reported by <@{user}> in Slack:\n{user_message}"
        )
        say(f"Created Jira ticket: *{ticket['key']}*\n{ticket['url']}",
            thread_ts=event["ts"])

    else:
        # General AI response
        response = ai_client.messages.create(
            model="claude-3-5-sonnet-20241022", max_tokens=512,
            messages=[{"role": "user", "content": user_message}]
        )
        say(response.content[0].text, thread_ts=event["ts"])

    # Remove the reaction once the reply is posted
    client.reactions_remove(channel=channel, timestamp=event["ts"], name="thinking_face")

# FastAPI handler for Slack events
from fastapi import FastAPI, Request
api = FastAPI()

@api.post("/slack/events")
async def endpoint(req: Request):
    return await handler.handle(req)
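slack_bolt verifies request signatures for you when you pass `signing_secret`, but if you ever build a receiver without Bolt, Slack's documented scheme is HMAC-SHA256 over `v0:{timestamp}:{body}`. A minimal sketch:

```python
import hashlib
import hmac
import time

def verify_slack_signature(signing_secret: str, timestamp: str,
                           body: bytes, signature: str) -> bool:
    """Check Slack's X-Slack-Signature header against a recomputed HMAC."""
    # Reject replays older than 5 minutes, per Slack's guidance
    if abs(time.time() - int(timestamp)) > 60 * 5:
        return False
    basestring = f"v0:{timestamp}:{body.decode()}".encode()
    expected = "v0=" + hmac.new(signing_secret.encode(), basestring,
                                hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

# Demo with a made-up secret and payload:
ts = str(int(time.time()))
body = b'{"type":"event_callback"}'
sig = "v0=" + hmac.new(b"secret", f"v0:{ts}:{body.decode()}".encode(),
                       hashlib.sha256).hexdigest()
print(verify_slack_signature("secret", ts, body, sig))  # True
```

Note `hmac.compare_digest` rather than `==` — constant-time comparison avoids leaking signature bytes through timing.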
💼

Enterprise API Integrations

Connectors
import requests
import os

# ── Jira: create tickets from AI classification ────────
def create_jira_ticket(summary: str, description: str,
                        priority: str = "Medium", project: str = "ENG") -> dict:
    response = requests.post(
        f"{os.environ['JIRA_BASE_URL']}/rest/api/3/issue",
        auth=(os.environ['JIRA_EMAIL'], os.environ['JIRA_API_TOKEN']),
        json={
            "fields": {
                "project": {"key": project},
                "summary": summary,
                "description": {
                    "type": "doc", "version": 1,
                    "content": [{"type": "paragraph",
                                 "content": [{"type": "text", "text": description}]}]
                },
                "issuetype": {"name": "Bug"},
                "priority": {"name": priority}
            }
        },
        timeout=15
    )
    response.raise_for_status()  # surface auth/validation errors instead of a KeyError below
    data = response.json()
    return {"key": data["key"], "url": f"{os.environ['JIRA_BASE_URL']}/browse/{data['key']}"}
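Jira's v3 API requires descriptions in Atlassian Document Format rather than plain strings. A small helper (sketch) that wraps multi-paragraph text keeps that boilerplate out of `create_jira_ticket`:

```python
def to_adf(text: str) -> dict:
    """Wrap plain text in Atlassian Document Format, one ADF
    paragraph per blank-line-separated paragraph of input."""
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    return {
        "type": "doc",
        "version": 1,
        "content": [
            {"type": "paragraph",
             "content": [{"type": "text", "text": p}]}
            for p in paragraphs
        ],
    }

doc = to_adf("Steps to reproduce:\n\n1. Open the app\n2. Click login")
print(len(doc["content"]))  # 2
```

With this in place, the `"description"` field above becomes `to_adf(description)`, and AI-generated multi-paragraph descriptions render properly instead of collapsing into one block.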

# ── Notion: append AI summary to a page ───────────────
def append_to_notion_page(page_id: str, ai_summary: str):
    requests.patch(
        f"https://api.notion.com/v1/blocks/{page_id}/children",
        headers={"Authorization": f"Bearer {os.environ['NOTION_TOKEN']}",
                 "Notion-Version": "2022-06-28"},
        json={"children": [{"type": "callout", "callout": {
            "rich_text": [{"type": "text", "text": {"content": f"AI Summary: {ai_summary}"}}],
            "icon": {"emoji": "🤖"}
        }}]}
    )

# ── Webhook receiver: receive event, classify, act ────
from fastapi import FastAPI, BackgroundTasks
import logging

logger = logging.getLogger("webhooks")
app = FastAPI()

@app.post("/webhook/{source}")
async def receive_webhook(source: str, payload: dict, background_tasks: BackgroundTasks):
    # Acknowledge immediately — most providers retry or mark the delivery
    # failed within seconds — and do the slow AI work in the background
    background_tasks.add_task(process_webhook, source, payload)
    return {"status": "received"}

async def process_webhook(source: str, payload: dict):
    try:
        if source == "github":
            if payload.get("action") == "opened" and "pull_request" in payload:
                pr = payload["pull_request"]
                review = await ai_review_pr(pr["diff_url"], pr["title"])
                await post_github_comment(pr["comments_url"], review)
        elif source == "stripe":
            event_type = payload.get("type", "")
            if event_type == "payment_intent.payment_failed":
                customer_id = payload["data"]["object"]["customer"]
                await notify_customer_of_failure(customer_id)
    except Exception as e:
        # Log and move on — in production, also park the payload in a dead letter queue
        logger.error(f"Webhook processing failed (source={source}): {e}")
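The dead letter queue from the skills list can start as simply as this: a retry wrapper with exponential backoff that parks the payload after the final failure. An in-memory sketch — swap the list for a durable queue (Redis, SQS, a database table) in production:

```python
import time

DEAD_LETTERS: list[dict] = []  # in production: a durable queue, not a list

def process_with_retry(handler, payload: dict, max_attempts: int = 3) -> bool:
    """Retry a failing step with exponential backoff; park the payload
    in the dead letter queue after the final attempt."""
    for attempt in range(1, max_attempts + 1):
        try:
            handler(payload)
            return True
        except Exception as exc:
            if attempt == max_attempts:
                DEAD_LETTERS.append({"payload": payload, "error": str(exc),
                                     "attempts": attempt})
                return False
            time.sleep(2 ** attempt * 0.01)  # 0.02s, 0.04s, ... (short for demo)
    return False

calls = {"n": 0}
def flaky(payload):
    # Fails twice, then succeeds — simulates a transient API error
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")

print(process_with_retry(flaky, {"id": 1}))  # True
```

A scheduled job can then replay `DEAD_LETTERS` entries or alert the ops channel when the queue grows, which is exactly the failure path the capstone asks for.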
🛠

Capstone: Enterprise AI Automation System

⏱ 2–3 Weeks

Build an end-to-end enterprise automation that handles real support requests with AI.

Requirements

  • Slack bot that receives @mentions and responds with AI (RAG-backed for technical questions)
  • Automatic Jira ticket creation when classification = "bug report"
  • Daily digest Slack message: summary of tickets created, questions answered, topics covered
  • MCP server with 3+ tools registered in Claude Desktop
  • n8n workflow connecting at least 2 external systems
  • Error handling: if any step fails, log to structured JSON and send Slack alert to ops channel

Deploy it as a persistent service on Railway or Render, and test it in a real Slack channel for one week.
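For the daily digest requirement, it helps to separate formatting from delivery so the message can be unit-tested before touching the Slack API. A sketch with hypothetical record shapes (tickets with `key`/`priority`, questions with `topic`):

```python
from collections import Counter

def format_digest(tickets: list[dict], questions: list[dict]) -> str:
    """Build the daily digest Slack message from the day's activity."""
    topics = Counter(q["topic"] for q in questions)
    high = sum(1 for t in tickets if t["priority"] in ("High", "Critical"))
    lines = [
        "*Daily AI Support Digest*",
        f"Tickets created: {len(tickets)} ({high} high priority)",
        f"Questions answered: {len(questions)}",
        "Top topics: " + ", ".join(t for t, _ in topics.most_common(3)),
    ]
    return "\n".join(lines)

msg = format_digest(
    [{"key": "ENG-101", "priority": "High"}],
    [{"topic": "vpn"}, {"topic": "vpn"}, {"topic": "billing"}],
)
print(msg.splitlines()[1])  # Tickets created: 1 (1 high priority)
```

A scheduled n8n Cron node (or a plain cron job) can then call this and post the result via `chat.postMessage`.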

MASTERY CHECKLIST

When complete: move to Part 9 — Portfolio and Launch.
