Why You Should Consider Using MCP servers with GitHub Copilot

Ever found yourself wishing GitHub Copilot understood a brand-new feature or recent update you want to use? Maybe you’re writing Terraform and GitHub Copilot claims a resource value doesn’t exist – even though you know it does. This isn’t a limitation of GitHub Copilot itself, but rather of the underlying AI model it’s using (like GPT-4o, Claude Sonnet 4, or others), which may not be aware of the very latest changes. In this blog, I’ll explain what MCP servers are, why they matter, and how they can help solve exactly these kinds of problems.

GitHub Copilot is a powerful AI assistant, but its knowledge and suggestions are only as current as the AI model it’s using. GitHub Copilot supports a range of models – like GPT-4o, Claude Sonnet 4, Gemini, and others – each with different strengths, weaknesses, and update cycles. If the model you’re using was trained before a new feature or resource was released, GitHub Copilot won’t know about it – not because GitHub Copilot is out of date, but because the model it relies on is.

You can actually choose which model GitHub Copilot uses for your tasks, with options for speed, reasoning ability, and up-to-date knowledge. But even the best models have a knowledge cutoff, and that’s where MCP servers come in.

What Are MCP Servers? (And Why Should You Care?)

Imagine GitHub Copilot as your trusty co-pilot in the cockpit of a complex jet. It’s great at reading the dials and flipping switches you point out, but sometimes it misses the bigger picture – like what’s happening outside the window. That’s where MCP servers (Model Context Protocol servers) come in. They act like air traffic control, feeding GitHub Copilot real-time, contextual information about your project, your tools, and even your cloud services.

At its core, an MCP (Model Context Protocol) server acts as a tailored gateway for AI tools like GitHub Copilot. Instead of relying solely on the current model’s knowledge, MCP servers can provide a custom runtime, complete with your preferred tools, libraries, or provider versions.

While my example focuses on Terraform, the power of MCP servers goes far beyond infrastructure as code. Here’s how they can transform your workflow across the tech stack:

  • Cloud Platforms: Cloud providers like AWS, Azure, and GCP are constantly rolling out new services and APIs. An MCP server can feed your AI assistant real-time documentation, usage examples, and even best practices as soon as they’re available, so you’re always working with the latest cloud innovations – never left behind by yesterday’s docs.
  • Programming Languages and Frameworks: Whether it’s a shiny new Python library, the latest JavaScript framework update, or a tweak in your CI/CD pipeline, MCP servers ensure your AI assistant is always in sync. Imagine GitHub Copilot suggesting idiomatic code for a library released last week, or catching breaking changes in a framework before they trip you up.
  • DevOps and Infrastructure Tools: Beyond Terraform, tools like Ansible, Kubernetes, or Pulumi evolve rapidly. With an MCP server, your AI assistant can surface the newest modules, manifests, and deployment patterns right in your editor. No more scouring changelogs or piecing together examples from scattered sources.

Getting GitHub Copilot to know about the latest features in Terraform

Recently I wanted to use a new Azure networking feature with my Terraform: the ip_address_pool block for azurerm_virtual_network just landed in version 4.32.0 of the AzureRM Terraform provider, released on 5th June 2025 – barely two weeks ago.
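For context, here’s roughly what the new block looks like in use. This is a minimal sketch – the surrounding resource names are invented for illustration, and the attribute names inside ip_address_pool reflect my reading of the 4.32.0 provider documentation, so check the docs for your provider version:

```hcl
resource "azurerm_virtual_network" "example" {
  name                = "example-vnet"
  location            = azurerm_resource_group.example.location
  resource_group_name = azurerm_resource_group.example.name

  # New in AzureRM 4.32.0: draw the VNet's address space from an
  # Azure Network Manager IPAM pool rather than hard-coding address_space.
  ip_address_pool {
    id                     = azurerm_network_manager_ipam_pool.example.id
    number_of_ip_addresses = "256"
  }
}
```

A model trained before June 2025 has never seen this block, which is exactly why it flags valid configuration like this as an error.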

I knew the feature existed, so I began writing my Terraform. Then, when I asked GitHub Copilot to assist with a few related things, the problems started: the model I was using had no knowledge of the newer provider version and kept telling me my configuration was incorrect.

Let me show you an example without having Terraform MCP server configured:

Without the Terraform MCP server configured, it says it’s not possible

Enabling Terraform MCP Server for GitHub Copilot to use

Once you connect GitHub Copilot to this server, it gains instant awareness of the latest provider features and documentation. Suddenly, it can:

  • Suggest the correct syntax for ip_address_pool
  • Validate your code against the latest provider schema
  • Surface up-to-date examples and best practices
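As an example of the wiring, here’s one way to register HashiCorp’s Terraform MCP server in VS Code via a .vscode/mcp.json file, assuming you have Docker available locally. Treat this as a sketch – the exact file location, schema, and server image name may differ depending on your editor version, so verify against the official GitHub Copilot and Terraform MCP server documentation:

```json
{
  "servers": {
    "terraform": {
      "type": "stdio",
      "command": "docker",
      "args": ["run", "-i", "--rm", "hashicorp/terraform-mcp-server"]
    }
  }
}
```

Once the server is registered, GitHub Copilot’s agent mode can call out to it to fetch live provider schemas and registry documentation instead of relying on the model’s training data.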

It’s like flipping a switch – GitHub Copilot goes from “never heard of it” to “here’s exactly how you use it”.

With the Terraform MCP server configured, it provides the relevant documentation and usage guidance

Even better, it now provides me with example usage and some important notes to consider for my Terraform deployment.

With and without MCP server enabled

Without an MCP server, you’re left:

  • Manually troubleshooting new features
  • Missing out on GitHub Copilot’s full potential
  • Copy/pasting from Terraform documentation

With MCP server enabled, you get:

  • Real-time, accurate Terraform help
  • Less context-switching and frustration
  • The ability to use brand-new features (like ip_address_pool) as soon as they’re released
| Scenario | Without MCP Server | With MCP Server |
| --- | --- | --- |
| Recognises new provider features | No | Yes |
| Suggests up-to-date syntax | No | Yes |
| Reduces manual research | No | Yes |
| Keeps pace with Terraform updates | No | Yes |

Wrapping up

If you want GitHub Copilot to help you with the latest features, remember: it’s the underlying AI model (like Sonnet 4, GPT-4o, Claude, etc.) that determines what GitHub Copilot knows. MCP servers ensure your chosen model always has the latest, most relevant context – unlocking its full potential.

Next time you’re itching to use the latest Terraform features – or anything else on the cutting edge – don’t let GitHub Copilot fall behind. Set up an MCP server and watch your workflow transform, whether you’re building infrastructure, deploying to the cloud, or coding with the newest frameworks.

Have you tried it yet? I’d love to hear your experiences!
