Suyash Mohan

Create AI Agents Using Golang

When it comes to AI, almost every example I’ve seen is in Python. Don’t get me wrong—Python is great—but it’s not my language of choice. I love Go, and I’d much rather use it even for building AI agents. Fortunately, Go has solid support for AI applications. Let’s explore how you can use Go to create AI agents.

There’s no denying that Python’s ecosystem for machine learning and AI is fantastic. Libraries like PyTorch power countless models and frameworks. But the AI industry is naturally dividing into two phases:

  1. Model Development, where you train and refine AI models (usually in Python).
  2. AI Engineering, where you consume model capabilities—exposed via APIs—to build real-world applications.

If your goal is to build AI agents, you don’t need to train your own LLMs; you just need reliable API access to them.

I prefer Go because it’s statically typed and compiled. That means I catch many errors at compile time, and I can focus purely on solving the problem rather than wrestling with dynamic-typing quirks. Go is also easy to learn yet powerful, so I can get an agent up and running quickly.

Luckily, all the major LLM providers offer Go SDKs—OpenAI, Anthropic, Google—and even Ollama (if you’re running locally) provides a Go client (not surprising since Ollama itself is written in Go).

But an AI agent needs more than just an LLM SDK. You’ll probably integrate with things like Slack, vector databases, and other tools—and Go has libraries for all of these. There’s nothing stopping you from building a full-featured AI agent in pure Go.

To prove it, I’m building a simple Slack-based AI agent in Go. Over the coming weeks, I’ll share my learnings here and publish the code on GitHub. In the first version:

  • Slack integration uses the community-maintained slack-go SDK in socket mode (so I don’t need a public HTTP endpoint).
  • The agent listens only for app_mention events. When someone tags the bot, I extract the message, forward it to OpenAI’s Chat Completions API, and post the reply back to Slack.
  • Under the hood, I call the Chat Completions endpoint:
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/openai/openai-go"
    "github.com/openai/openai-go/option"
)

func main() {
    client := openai.NewClient(
        option.WithBaseURL("https://api.openai.com/v1"), // BASE_URL (the SDK default)
        option.WithAPIKey(os.Getenv("OPENAI_API_KEY")),  // API_KEY
    )

    resp, err := client.Chat.Completions.New(context.Background(), openai.ChatCompletionNewParams{
        Model: "gpt-4.1-mini",
        Messages: []openai.ChatCompletionMessageParamUnion{
            openai.SystemMessage("You are a helpful assistant."),
            openai.UserMessage("What is the capital of Malaysia?"),
        },
    })
    if err != nil {
        log.Fatalf("OpenAI API error: %v", err)
    }

    fmt.Println(resp.Choices[0].Message.Content)
}

That’s all it takes to get a response (you can also stream responses if you prefer).

On the Slack side, socket mode is trivial to set up and avoids the hassle of exposing a public URL. Your handler might look something like this:

app := slack.New(SLACK_TOKEN, slack.OptionAppLevelToken(APP_TOKEN))
socketClient := socketmode.New(
    app,
    socketmode.OptionDebug(true),
)

handler := socketmode.NewSocketmodeHandler(socketClient)

// HandleEvents filters Events API payloads down to app_mention for us.
handler.HandleEvents(slackevents.AppMention, func(evt *socketmode.Event, client *socketmode.Client) {
    // Ack immediately; Slack retries events that aren't acknowledged quickly.
    client.Ack(*evt.Request)

    eventsAPIEvent, ok := evt.Data.(slackevents.EventsAPIEvent)
    if !ok {
        return
    }
    ev, ok := eventsAPIEvent.InnerEvent.Data.(*slackevents.AppMentionEvent)
    if !ok {
        return
    }

    // Answer in a goroutine so a slow LLM call doesn't block the event loop.
    go func() {
        aiResp := queryOpenAI(ev.Text)
        app.PostMessage(ev.Channel, slack.MsgOptionText(aiResp, false))
    }()
})

// Blocks, receiving events over the WebSocket connection.
if err := handler.RunEventLoop(); err != nil {
    log.Fatal(err)
}

With this minimal setup, you have a working Slack bot powered by Go and OpenAI. Next, I plan to add more advanced features, like tool usage and retrieval-augmented generation (RAG).

Stay tuned: I’ll post updates here and publish the full source code on GitHub!