AI Capabilities

Zcrafter leverages Large Language Models (LLMs) to act as an intelligent pair programmer for mainframe development.

AI Agents

Zcrafter uses specialized "Agents" for different tasks; a rough sketch of this division of labor follows the list:

  1. Coder Agent: The primary agent for answering questions, writing code, and analyzing problems. It has access to your current file context and project structure.
  2. Summarizer Agent: Generates concise summaries of long outputs or file contents.
  3. Task Agent: Breaks down complex requests into smaller, actionable steps.
  4. Title Agent: Generates short, descriptive titles for your conversation history.
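
As an illustration only, the sketch below shows how different kinds of work might be dispatched to these agents. The type names and dispatch logic are assumptions made for this example, not Zcrafter's actual internals.

  // Illustrative sketch only: names and shapes are assumptions, not Zcrafter's internals.
  type AgentKind = "coder" | "summarizer" | "task" | "title";

  // Pick the agent that matches the kind of work requested.
  function pickAgent(work: "answer" | "summarize" | "plan" | "name"): AgentKind {
    switch (work) {
      case "answer":    return "coder";      // questions, code changes, problem analysis
      case "summarize": return "summarizer"; // condense long output or file contents
      case "plan":      return "task";       // break a request into actionable steps
      case "name":      return "title";      // label the conversation in history
    }
  }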

Context Management

The AI is only as good as the context it has. Zcrafter assembles this context for you automatically; a sketch of what a typical request carries follows the list:

  • Current File: When you have a file open (e.g., a COBOL program), its content is sent to the AI.
  • Conversation History: The AI remembers previous questions and answers in the current session.
  • Project Context: You can configure the AI to be aware of your entire project structure (see Configuration).
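
Taken together, each request carries a bundle along the lines of the sketch below. The interface and field names are assumptions for illustration only, not Zcrafter's actual schema.

  // Illustrative sketch only: field names are assumptions, not Zcrafter's actual schema.
  interface ChatContext {
    currentFile?: {
      path: string;     // e.g. the open COBOL member
      contents: string; // full source text sent with the request
    };
    // Prior questions and answers from the current session.
    history: { role: "user" | "assistant"; text: string }[];
    // File paths, included only when project context is enabled (see Configuration).
    projectTree?: string[];
  }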

Prompt Engineering for Mainframe

To get the best results for COBOL, JCL, and PL/I, keep these guidelines in mind (a worked example follows the list):

  • Be Specific: Instead of "Fix this," say "Fix the S0C7 abend in the calculation paragraph."
  • Provide Error Codes: Always include ABEND codes (e.g., S0C4, S806) or error messages.
  • Ask for Explanations: "Explain what this JCL step does line-by-line."
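
For example, compare a vague request with a sharper version of the same request (the paragraph name below is illustrative):

  Vague:    Fix this.

  Specific: Fix the S0C7 abend in the CALC-NET-PAY paragraph of this program,
            and explain which field most likely holds non-numeric data.

The second prompt names the abend code, points the AI at the right paragraph, and asks for the reasoning behind the fix, which gives the Coder Agent far more to work with.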

Available Models

Zcrafter tracks new model releases from the major providers and adds support as they become available. Supported providers include:

  • OpenRouter: Access to all top-tier models (Claude 3.5+, GPT-4o+, etc.).
  • Anthropic: Latest Claude Sonnet, Opus, and Haiku models.
  • OpenAI: Latest GPT-4 and o1/o3 series models.
  • Google: Latest Gemini Pro and Flash models.
  • Amazon Bedrock: Access to Claude 3.5/4.5 via AWS.
  • Groq: High-speed inference for Llama 3 and other open-source models.

Enterprise Providers

Zcrafter also supports connections to:

  • Azure OpenAI: For enterprise-managed GPT deployments.
  • Vertex AI: For Google Cloud-managed Gemini deployments.
  • GitHub Copilot: Integrate directly with your Copilot subscription.