All about Context: The Rise of MCP and Web Security Evolution
A deep dive into how context is everything - from 30 years of SSL/TLS disasters to MCP's lightning-fast adoption. Learn why MCP succeeded in 8 months where HTTP took decades.
All about Context
Is this thing secure?
Okay, elephant in the room: this is a long article. But here's why that matters - like MCP itself, it starts slow then hits ludicrous speed. Don't blink during the timeline or you'll miss the entire future of AI tooling.
This isn't just a security story. It's about how context is the missing ingredient that made SSL/TLS take 30 years to get right, let MCP succeed in 8 months, separates Terry-level craftsmanship from hack jobs, makes jokes work (or fail), and connects every idea in this article. Context is everything.
🗺 Choose Your Adventure:
🚶‍♂️ Full Journey - Complete evolution story (25 min)
📚 Learn from History - Why web security failed + MCP lessons (15 min)
🏃‍♂️ Speed Run - MCP timeline (8 min)
🔬 Deep Dive - Advanced MCP architecture (5 min)
Pre-2025 web 'evolution' seemed fast at the time; below, I'll compare it to the current state of the art.
Seriously, software development, like any profession that deeply affects humans (and the world we live in), is an art. I'm not saying everyone is a good artist.
I'm saying that some do it for the dough, others do it for ❤️. Promise this is valid context for the exciting scrolling journey ahead. I'm sure you'll read every word.
If you find yourself in the library (toilet), it's cool, everyone does it, it's kinda weird though so pretend you aren't. (swear this is relevant, reading foreplay)
For shits and giggles, take a side trip to the 🌐 first website EVER! - don't forget to come back!
🔒 that lock icon in your browser
HTTP - the thing that makes the 🕸️ web work. Every time you type a URL, click a link, or submit a form, (most of your 'phone calls' too!) you're using HTTP. It's the universal language that lets your browser talk to servers. Without it, there'd be no web as we know it. No Amazon, no social media, no cat videos. HTTP is so ubiquitous that we don't even think about it - it's just there, like electricity or running water.
Al Gore didn't invent the internet, a huge posse of people did. Read on to learn more.
HTTP 0.9 was introduced in 1991. It did one thing: GET. This is analogous to the introduction of MCP in Nov. 2024.
It was almost a year later, in 1992, before Tim Berners-Lee (@timbl) added very basic, completely insecure Basic Authentication. Four years later, RFC 1945 was born: HTTP/1.0.
🥸 Because HTTPS wasn't a thing yet, a better name would have been whisper your password quietly and hope no one is listening. Fun fact: it uses Base64 encoding. That's encoding, not encryption - anyone who intercepts it can reverse it instantly.
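Since Base64 keeps showing up in web auth headers, here's what "not encryption" means in practice - a quick sketch with Python's standard library (the credentials are made up):

```python
import base64

# Basic Auth sends "username:password" Base64-encoded.
# This is encoding, not encryption: anyone on the wire can undo it.
credentials = "alice:hunter2"                      # hypothetical user
header_value = base64.b64encode(credentials.encode()).decode()
print("Authorization: Basic " + header_value)      # what goes over the wire

# "Decrypting" it takes exactly one function call:
recovered = base64.b64decode(header_value).decode()
print(recovered)  # alice:hunter2
```

No keys, no secrets - which is why Basic Auth without TLS is just a password broadcast.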
'That'll do' Years (1994-200X)
You gotta understand - everything was new. I mean EVERYTHING. We didn't have the computing primitives we take for granted today. JavaScript? Didn't exist until December 1995. CSS? Not until 1996. Hell, most people were still using dial-up modems that made sounds like R2-D2 having a seizure.
"That'll do, Pig. That'll do." - The perfect metaphor for 90s web development
We were literally building the foundations of modern computing while standing on quicksand. Every "standard" was really just "what Netscape/Microsoft/Cisco/start-up/whoever did last week."
Meanwhile, the webmasters/technicians/engineers/administrators/sysops, and a bunch of other amazing trade workers, factory workers... I can't name everyone. Let's just say lots of people did their 'thing' and figured it out, investing their time, toil and creativity.
Everyone had to constantly adapt, adjust and 'MacGyver that shit'.
The fact that ANY of this worked is a minor miracle. The fact that we're still using evolved versions of these technologies in 2025? That's testament to how solid these foundations turned out to be.
1994 🍪 Can i haz cookie?
Lou Montulli, a founding engineer at Netscape, ran into a fundamental problem with the HTTP protocol: it's stateless.
This means your browser has amnesia - every time it talks to a server, it's like they've never met before. The server has no memory of previous requests. Without some kind of state, you couldn't build something like a virtual shopping cart (or, for that matter, do nefarious tracking of unknowing users).
Montulli devised the "cookie" as a small piece of data that the server could send to the browser, which the browser would then store and send back with every subsequent request to that same server. This allowed the server to "remember" the user across multiple page views, enabling login sessions, shopping carts, and personalization.
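The mechanics are just two HTTP headers. Here's a minimal sketch using Python's standard library (the session value is made up):

```python
from http.cookies import SimpleCookie

# Server side: hand the browser a small token to "remember" it by.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"        # hypothetical session token
cookie["session_id"]["path"] = "/"
print(cookie.output())                 # the Set-Cookie response header

# Browser side: echo it back on every later request to the same server.
incoming = SimpleCookie("session_id=abc123")
print("Cookie: session_id=" + incoming["session_id"].value)
```

That round trip - Set-Cookie down, Cookie back up - is the entire foundation of login sessions and shopping carts.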
Side-timeline on cookies: it was 1995 when the public could use cookies, if they had the latest Netscape; IE followed suit. Keep in mind, the browser could take hours to download, and you might not have a way to "back up" your hard drive. You might not have enough space, either :D
It took until February 1997 for RFC 2109 to make cookies "official." It was 2011 before there were any significant cookie updates.
1995 SSL 2.0 (and HTTP/1.0)
In February 1995, Netscape released SSL 2.0 to the world. Now we've got "secure" communications! Well, sorta...
Meanwhile, HTTP/1.0 (RFC 1945) showed up late to the party in May 1996, basically documenting what everyone was already doing in the wild.
1996 SSL 3.0
SSL 2.0 was broken faster than a politician's promise, so in 1996 they released SSL 3.0. Third time's the charm, right?
This time, they brought in the big guns. Paul Kocher (a 23-year-old cryptography wunderkind), Phil Karlton, and Alan Freier completely redesigned the protocol from scratch. No more band-aids on bullet wounds. Kocher, who would later win the Marconi Prize for this work, built something that was actually... good?
The genius of SSL 3.0 wasn't just that it fixed the security holes - it was designed to evolve. Kocher anticipated that future research would discover new attacks and algorithms, so he built in the ability to negotiate sessions and swap out weak algorithms for stronger ones without breaking the whole system.
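That negotiation machinery is still how we retire weak algorithms today. Here's a small sketch with Python's `ssl` module showing the modern version of the same dial:

```python
import ssl

# SSL 3.0's lasting idea: the two sides *negotiate* a protocol version
# and cipher suite, so weak algorithms can be dropped without replacing
# the whole protocol. Today you simply raise the floor:
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse SSL 3.0 / TLS 1.0 / 1.1
print(ctx.minimum_version.name)
```

Every downgrade attack in this article (POODLE, FREAK, Logjam) is, at heart, an abuse of that same negotiation - which is why setting the floor matters.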
Fun fact: Kocher was mostly self-taught in cryptography and originally planned to be a veterinarian. Instead, he ended up securing the entire internet. Life's funny that way.
1997 HTTP/1.1 - The Real Deal
In January 1997, HTTP/1.1 (RFC 2068) finally arrived with the killer feature: the Host: header.
🏠 This allowed a server to host more than one 'domain'. For you newbs: back in the day, if you wanted to run more than one website, you needed another physical server. You might even need a bridge or MAU (shout-out to Token Ring).
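To make that concrete, here's the request sketched out in Python, so you can see exactly which line carries the magic (the domain is a placeholder):

```python
# HTTP/1.0: one IP address, one website.
# HTTP/1.1: the Host header lets one server answer for many domains.
request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"      # the killer feature: *which* site do you want?
    "Connection: close\r\n"
    "\r\n"                       # blank line ends the headers
)
print(request)
```

Strip that one `Host:` line and the server has no idea which of its virtual hosts you meant - that's the whole trick behind name-based virtual hosting.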
As far as security goes, it got a little bit better with Digest Authentication: every time a server prompted for a password, it would send a "nonce", a random one-time value. Kinda like those annoying "2FA" messages...
The client would hash the nonce together with the username and password, and send the hash instead of the password itself.
A modern computer can "crack" this type of hash in about 10 minutes (way faster on the big iron).
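For the curious, the hash-up looks roughly like this - a sketch of the RFC 2069-style Digest scheme (all the values below are made up):

```python
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

# Digest auth: the password never crosses the wire in the clear...
username, realm, password = "alice", "example", "hunter2"   # hypothetical
nonce = "dcd98b7102dd2f0e8b11d0f600bfb0c0"                  # server's one-time value
method, uri = "GET", "/secret.html"

ha1 = md5_hex(f"{username}:{realm}:{password}")
ha2 = md5_hex(f"{method}:{uri}")
response = md5_hex(f"{ha1}:{nonce}:{ha2}")                  # what the client sends
print(response)

# ...but MD5 is so fast that an eavesdropper who captures the nonce and
# the response can brute-force short passwords offline in minutes.
```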
1999 TLS 1.0
So 🥳 party on, right? Haha, not so fast there, keyboard commander. SSL 3.0 was broken as a politician's promise (love u Grandpa)!
No big deal, they -> IETF "fixed" it in January 1999 with "TLS 1.0" (RFC 2246).
The name change was because Netscape 'owned' SSL. Cue modem sounds...
Oh no, 1.0, those are always broken right?
Speaking of partying and v1.0... Y2K!
While we were all panicking about two-digit years breaking everything, we were literally sending passwords in plain text over the internet. Y2K: the greatest misdirection in tech history.
Fun fact: More damage was probably done by companies rushing untested Y2K "fixes" into production than the actual date rollover would have caused. But hey, at least consultants got new boats!
I finished my schooling and 'officially' entered the workforce during peak Y2K hysteria. First lesson learned: fear sells better than actual security fixes.
2000's
During these ancient times, it was not only rocket science to get a certificate (WtF is a CSR? You want me to run what commands?), you had to pay.
Certificate Authorities
Network Solutions, 👯‍♂️ GoDaddy, and others made a small fortune on certificate signing. Safe estimates: 💰 $700M to $1.5B (big B!) annually.
2006 TLS 1.1
Yeah, TLS 1.0 was broken alright, and it wasn't a 'legal issue' this time. In 2006 it was fixed with TLS 1.1. I'm sure everybody updated all their software...
First rule of programming
Never 'roll your own' encryption. Not everyone listens (and that's a good thing IMHO).
Some cool cats (and countless other direct and indirect contributors) were like, someone's gotta do it, so they did (keep reading for details).
Who knows what goes on in the mind of folks on this level? I can only imagine, and probably not vividly enough.
On behalf of the entire human race, I want to thank all the folks that made our world! There are so many, but I'm gonna talk about two:
Tim Dierks π
A Long History with TLS: Tim Dierks has been a central figure in the development of TLS for a very long time.
Before authoring TLS 1.2, he was also a co-author of the previous versions, TLS 1.0 (RFC 2246) and TLS 1.1 (RFC 4346).
Role as Editor: His role was that of an "editor" or lead author for the IETF working group. He was responsible for taking the discussions, proposals, and consensus from the community and drafting them into a coherent and precise technical specification.
Industry Experience: At the time, he had extensive experience in the security industry, having worked for companies like Certicom (a pioneer in elliptic curve cryptography) and was later a key engineer at Google. His deep, long-term involvement made him the natural choice to continue leading the documentation of the protocol's evolution.
Eric Rescorla π
A Renowned Security Expert: Eric Rescorla is one of the most respected and influential figures in applied cryptography and internet security. He is a prolific researcher and engineer who has contributed to numerous security standards.
Deep Technical Expertise: He brought immense technical depth to the project. He is the author of the widely respected book "SSL and TLS: Designing and Building Secure Systems" and has taught security courses at Stanford University.
The Future of TLS: His involvement didn't stop with 1.2. Underscoring his central role in the protocol's evolution, Eric Rescorla went on to become the sole author of the next-generation standard, TLS 1.3 (RFC 8446), which was a massive overhaul of the protocol.
Industry Leadership: He has held prominent roles, including CTO of the Mozilla Foundation, where he was responsible for the security and technical direction of the Firefox browser.
2008 TLS 1.2
So two years go by, and Tim and Eric (renowned developers - we can't thank these people enough) are finally starting to sleep at night, the recurring nightmare of their elderly parents asking why 'this nice, foreign-sounding man says he has my login' slowly fading.
I might have made part of that up, but you get the point.
Untold umpteen-quadrillions of 'secure' bytes served, and PCI DSS was breathing down everyone's neck!
Proactively, a team led by Tim and Eric made many improvements, and TLS 1.2 was released.
And they all lived happily ever after.... Nope. It lasted until the BEAST showed up 2011.
Insider secret, there were many powerful organizations that were aware of this. SSL was never secure. If you look at the history of electronic communication, it never was/is. Unless you are the "powerful organization". Turns out that's not all you need, just ask Pete Hegseth :)
2011 BEAST - Browser Exploit Against SSL/TLS
Turns out the ecosystem still allowed insecure "encryption" for older browsers - BEAST exploited a weakness in TLS 1.0's CBC mode that servers kept supporting for compatibility. This was intentional (~2007), to make sure nobody was left out. The Internet was booming at that time, and nobody wanted to waste their precious SEO money and find out their mom couldn't get to their website on her "old computer". It was probably on AOL, and hadn't been "updated" since the factory.
You gotta understand, back then it was really slow to "Update your Apps" - many of the applications in use didn't even support automatic updates, or you had to pay for them.
2014 💔 Heartbleed 🩸 AND POODLE!
❤️‍🔥 The internet was on fire! POODLE: Padding Oracle On Downgraded Legacy Encryption. Again, older browsers were at risk - a forced downgrade to SSL 3.0 allowed for insecure communication!
Heartbleed was the worst... The code was broken. Catastrophic bug in OpenSSL. This time the protocol was fine - someone just built it wrong. Like having perfect blueprints but using duct tape instead of rivets.
Too bad it was "THE" implementation. OpenSSL was not only on just about everyone's computer/phone, but also in every server, camera, and every other dumb-idea-connected-to-the-internet (see here for an index).
I can recall stories of people scrambling asking "who knows how to work on OpenSSL?".
2015 😱 FREAK (Mar) & 🪵 Logjam (May)
Again, legacy browsers supporting 90's "export-grade" (so-called) cryptography were weak sauce! These were part of a series of major SSL/TLS vulnerabilities that highlighted the urgent need for the kind of widespread, modern encryption that Let's Encrypt was created to provide.
- FREAK: An attacker could trick a modern browser and server into negotiating one of these ancient, weak "export" cipher suites, which could then be broken in hours.
- Logjam: A similar attack targeting the Diffie-Hellman key exchange algorithm when used with weak, common prime numbers.
2015-2020 HTTP/2 Takes Over
While we were all distracted by TLS drama, HTTP itself got a major upgrade. HTTP/2 (based on Google's SPDY) multiplexed connections, compressed headers, and finally made the web feel fast. By 2020, over 50% of websites were using it.
2015 🔐 Let's Encrypt (Dec)
Finally, some of the 'powers that be' - a few of them, actually - stepped up. (They have a PDF; I was hoping to link you to a page full of logos, but alas.) It was some cool cats:
- Josh Aas (of Mozilla fame)
- Peter Eckersley (EFF)
- J. Alex Halderman (University of Michigan)
These wonderful human beings were like, 'we' are all paying $1.5B+ a year and it's still insecure? At the time, it was very, very common to "ignore the security error" in your browser.
Side note...
I'm not sure what motivated them, but I gotta think these super-nerds realized how damaging it was for everyone to just "ignore" the security errors.
Maybe the "certificate authorities" knew this all along and were just seeing how long they could make it work?
My optimistic view is that, initially, it was pretty universally thought that if a certificate didn't "cost extra" and didn't require filling out paperwork, nobody would trust it.
I don't imagine an "app store" was in the minds of executives. As the TLS and SSL authors erred in the protocol and software implementation, so did the "business decision makers".
Maybe some of the exec's did see the light, because EFF, Mozilla Foundation, Akamai, Cisco, IdenTrust, Google Chrome, Facebook, Automattic and others busted out the checkbooks.
Woulda coulda shoulda
It seems that if we had it all to do over again, the certificate would be included with your domain. The idea that 'anybody' should get a domain, but only those who meet minimum requirements and can afford the 10-100x additional annual cost should get a certificate, sounds kinda ludicrous.
There were other aspects at play - probably encryption restrictions, 'cold-war' type stuff. It's kinda hilarious when you consider the recent news of how government officials are conducting business on known, third-party channels that are as insecure as it gets.
It's my opinion that the current situation would be improved if "everybody" had access, rather than just the 'bad actors'. Given that the public has been given proof of the messages, maybe this is the case now?
Ok, back to SSL, hope you're still here, MCP is still 9 years away (it's two more sections...)!
🏆 Achievement Unlocked! 🏆
You now understand web security better than people who built it in the 90s.
(Seriously, you just learned what took the industry 20 years to figure out)
🦸 2018 TLS 1.3
Finally, Eric Rescorla basically said "let's rebuild this from scratch and do it right." Major overhaul, tons of improvements, and here we are today - mostly secure, mostly free, mostly working. After 4 years of work and endless debate, we got a protocol that actually fixed the fundamental problems instead of just patching holes.
2022 HTTP/3 - Because Why Not?
June 2022 saw the publication of HTTP/3 (RFC 9114). The madlads threw out TCP entirely and built it on top of QUIC (which runs on UDP).
Want to hear a funny UDP joke? The problem with UDP jokes is that not everyone gets them. context
You might not get it, and I don't care!
Yes, they rebuilt the entire internet protocol stack. Again.
The crazy part? It actually works. HTTP/3 is faster, more reliable on crappy connections (looking at you, mobile networks), and handles packet loss like a champ. As of 2024, 95% of browsers support it and about a third of major websites use it. And it doesn't bolt TLS on top the old way - QUIC bakes TLS 1.3 directly into the transport.
Looking at Cloudflare Radar, you can see the stats: HTTP/1.x: ~10% HTTP/2: ~60% HTTP/3: ~30%
So here we are in 2025. From "please don't look at my password" in 1991 to quantum-resistant encryption and protocols that can handle your phone switching from WiFi to 5G without missing a beat. Not bad for 34 years of duct tape and good intentions.
Which brings us to...
🧘 STRETCH BREAK! 🧘
You've been reading about security disasters for 15 minutes.
Stand up. Touch your toes. Get some water.
(The MCP part is coming up and your mind needs to be blown properly)
Cool Story Bro.
I know, thanks. I had a lot of fun writing this, reminiscing about the decades spent with computers. I've been doing it way too long to just say "the web". You won't find me on social media - I'm old school.
That's not the end of the story. It's just the very beginning. I could go on about many other earth-shattering technologies, and the seemingly stubborn refusal of "the public" to adopt them. We could talk about the old phone system and how you could whistle into a payphone and own AT&T (shout out to @LewPayne).
Just like the evolution of the web, this next part will go by WAY FASTER!
Remember how HTTP took 30+ years to get (mostly) right? Well, buckle up, because MCP is speedrunning that entire evolution.
I used to wonder how long it would take before people would "get it"; now I'm struggling to keep up! Spoiler: it gets worse in the future - unless you read this excellent article to the end.
If you're sharing this and want to skip the intro, use this link 👉 Jump to MCP
It'd probably convert much better if you tell them the first part yourself, then send them the link... There's that 'context' again (keep reading)...
The rise of MCP
⏱️ Fun fact: You've been reading for about as long as it took MCP to get its first 1,000 servers online.
(HTTP took 6 years to get basic authentication)
Before HTTP(s) and cheap hardware/software
Way back when I tried to show people that they could see images, updated in the last year, in moments - and for cheap, or even free, or buy something online - they said:
- What's a computer? Or what's a modem, or my favorite, 'are you talking about that internet super-highway thing?'.
- I already have CD-ROMs, hard disks (did you read that right?) and/or floppies.
- How would you even get pictures into a computer?
- Is that safe? (hehe, see what I did there...)
Before MCP
- Is that 'AI'? (eyes roll)
- Is that 'A1' (news reference)
- General Disbelief
- OMG the security holes
The 'straw that broke the camel's back' - nay, the 'spark' I needed to put this together. I blame a lack of context being provided, by me to him:
- "I wouldn't use a 'CMP' (sic) server."
(quote from a multi-area, genius - musician/AI developer, all around amazing human)
Classic - we don't know what we don't know. One thing I (am pretty sure I) know is that MCP is truly 'remarkable'. I've never seen such adoption happen so quickly. HTTP = Moore's Law; MCP = Huang's Law.
MCP Timeline
Okay, I said it was going to get fast - look at the timeline below!
keep scrolling for the punchline
🚀 Model Context Protocol: Lightning Speed Adoption
LAUNCH
- Anthropic announces & open-sources MCP
- Protocol revision 2024-11-05
- Described as 'USB-C of AI apps'
- Pre-built servers: Google Drive, Slack, GitHub, Git, Postgres, Puppeteer
- Early adopters: Block, Apollo
- Dev tools join: Zed, Replit, Codeium, Sourcegraph
RAPID GROWTH
- MCP takes center stage
- Teams realize data connection challenges
- Adoption accelerates rapidly
- Community contributions grow
ECOSYSTEM MILESTONE
- Over 1,000 open-source connectors
- Rapid community expansion
- Anthropic refines specs & docs
MAJOR ADOPTION
- OpenAI officially adopts MCP
- ChatGPT desktop app integration
- OpenAI Agents SDK & Responses API
- Protocol revision 2025-03-26
- Enhanced specifications
EXPANSION & SECURITY
- Google DeepMind joins
- Demis Hassabis: 'standard for AI agentic era'
- Gemini models to support MCP
- Security researchers identify issues:
- Prompt injection vulnerabilities
- Tool permission concerns
- Lookalike tool risks
MASSIVE GROWTH
- 5,000+ active MCP servers
- Glama directory milestone
- DeepLearning.AI launches course
- 'Build Rich-Context AI Apps'
- Taught by Elie Schoppik
ENTERPRISE READY
- Microsoft Copilot Studio GA
- New features:
- Tool listing
- Streamable transport
- Enhanced tracing
- Quality improvements
CURRENT STATUS
- Mature Ecosystem:
- Multiple SDKs: Python, TypeScript, Java, Kotlin, C#
- Wide industry adoption
- Active open-source community
- Future Development:
- Agent graphs
- Interactive workflows
- Additional modalities
- MCP Registry for discovery
THE FUTURE
- Secure, Context everywhere
- MCP Server begets an MCP server that begets a cluster of MCP Servers and Clients
- The cluster takes over the world and everyone lives happily ever after
🎉 Holy shit, you made it to the punchline! 🎉
You just witnessed the fastest protocol adoption in tech history.
(Your scrolling speed: ~20 minutes. MCP adoption speed: Ludicrous.)
Money Shot
Look at that timeline again. Eight. Months. That's it.
💬 "HTTP took three decades to figure out basic security. MCP went from 'hey, here's an idea' to 'Google, OpenAI, and Microsoft are all in' faster than it took SSL to get its first bug fix."
[Tweet this]
Nobody has any idea how many MCP Servers and Clients there are. It's like trying to count rabbits - by the time you're done, they've multiplied. It turns out Clients can be Servers and vice versa. Stay tuned for more on that... It's like your superpowers got superpowers.
Don't be the dullest tool in the shed
So why did MCP spread like wildfire while HTTP took decades to get basic auth working? Two words: proper context.
See, MCP servers are like tools in a shed. A hammer's great for nails, useless for screws. A saw cuts wood, not bolts. Each tool has its purpose, but only when you know when and how to use it. The beauty of MCP is that it lets tools explain themselves - what they do, what they need, what they're good for.
Early 2024, when I was really getting into AI development (okay, getting good at copy paste), I was constantly thinking/saying, it's all about "Context". I spent quite a lot of time trying to figure out how to give models context, my code, documents, pictures from a home inspection, terraform schema, repositories, public legal documents...
Here's the thing: MCP solved this. Instead of me shoving random data at an AI and hoping it figures it out, MCP servers can say "Hey, I handle database queries" or "I'm good with weather data" or "I analyze code repositories." They come with context built-in.
Take my site idea, MCP Shed - tools are rated by "sharpness". Get it? Sharp tools? But without knowing it's a play on "not the sharpest tool in the shed," the whole concept falls flat. That's context.
Just like Terry methodically laying out his tools on that spotless cover, MCP Shed would let you see exactly what each tool does and when to use it. No more guessing, no more shoving random hammers at screw problems.
What do 'AI' and Jokes have in common?
They're both way better with context.
When you tell a joke and end up saying "You had to be there" - lack of context.
When something "goes right over someone's head" - lack of context.
When SSL 1.0 gets broken by an edge case nobody thought about - lack of context.
Without context, the joke is a waste of time! It wastes your time, and your poor audience's time.
It's no different with LLMs/AI/MCP/NewHotness... Without context, it wastes human time and those currently limited computing/energy resources. But WITH context? That's why MCP is spreading faster than gossip in a small town. Finally, AI tools can actually tell you what they're for.
Use context powers wisely
Beware of "too much context".
That friend that asks about that time you did that thing, you know, that thing you did thousands of times with them, you know, that exact time?
The guy on your way out of the office that talks for way too long, sometimes you just can't get away (Claude cancel button, I'm talking to you!).
Moral of the story
Here's what 30 years of SSL/TLS disasters taught us that MCP developers need to know:
Don't repeat history. SSL 1.0 was broken. SSL 2.0 was broken faster than a politician's promise. TLS 1.0? Also broken. Each time, smart people thought they'd patched the holes, but they were building on quicksand.
MCP got it right in 8 months because it learned from those mistakes. But here's the thing - you have to learn from them too.
Terry-level MCP development
I once had the honor of working for a very talented upholsterer/tucker. A lot stands out, but I remember how, when he was working on priceless (to me) convertibles, he would meticulously lay a spotless, soft cover on the trunk, lay out his tools and plan his work.
Everyone that knew Terry would say Terry was 'their guy' - and not in a 'knows a guy who knows a guy' way, but in a 'this is the craftsman you trust with your baby' way. He was experienced, thoughtful, meticulous and most of all, he seemed to know what he didn't. Don't get it twisted, I'm saying the guy knew his limits, consulted manuals/others/took his time and then gained the context he needed. Sometimes that context was, "it's someone else's problem", other times it was "need help", other times it was "waiting for parts", sometimes he just "needed more time".
Unlike those SSL developers who kept patching holes for decades, Terry understood the whole system before touching anything. That's the difference between craftsmanship and "ship it and see what breaks."
You owe it to every Tom, Dick and Eric - every Harriet, Grace, github dev, keyboard hero, nameless contributor who spent decades fixing the web's security disasters.
For the love of all things, when you build your MCP servers:
- Give them proper context - not just data dumps, but meaningful descriptions of what they do and when to use them
- Test your edge cases - SSL kept breaking because nobody thought about the weird scenarios
- Document clearly - if Terry had to guess what tools he needed, your convertible would look like shit
- Think security first - don't ask for passwords in elicitation (seriously, don't be that person)
- Know your limits - Terry knew when to say "it's someone else's problem"
The real moral
Go forth and craft your MCP servers with care. I've been inspired by many craftspeople across many disciplines (I don't use that word lightly). Think about your own personal experiences - that perfect meal, beautiful furniture, zip-tie hack that's just right...
The web took 30 years to get security mostly right. MCP gives us the chance to do it right from the start. Don't waste it building the digital equivalent of a Home Depot deck that collapses when someone sneezes.
Build something Terry would be proud to put his name on.
Bonus Points
If you got this far, thanks! You either have way too much time on your hands, or you're picking up what I'm putting down, if it's both, HMU :).
Here's some cool MCP features you might like, maybe it will be the context to spark your next MCP Server/Client/Combo.
Roots
Roots are something that the "Client" advertises to the "Server". Let's say you are doing something with server and it needs to know what's available.
For example: what project do I have open in my editor/IDE/Blender/etc.? Maybe you have a time-tracker widget that you update with which project you're working on. When you change it, your MCP Clients automatically tell the MCP Servers what new list of "resources" is available.
Maybe I run aider or another client, and when I ask it to do something, the MCP client tells the MCP Servers where to target its code search/knowledge-base tools.
You could have an MCP server set up that changes the roots on all your active clients.
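On the wire, roots are a small JSON-RPC exchange. Here's a sketch of the message shapes (the URIs and names are made up; see the MCP spec for the authoritative schema):

```python
import json

# Server -> client: "what roots am I allowed to work inside?"
roots_request = {"jsonrpc": "2.0", "id": 1, "method": "roots/list"}

# Client -> server: the currently open project(s).
roots_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "roots": [
            {"uri": "file:///home/me/projects/mcp-shed", "name": "MCP Shed"},
        ]
    },
}

# When you switch projects, the client pushes a notification
# and servers re-query roots/list.
changed = {"jsonrpc": "2.0", "method": "notifications/roots/list_changed"}

print(json.dumps(roots_response["result"]["roots"], indent=2))
```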
Elicitation
Not that kind of MCP server! Haha - Elicitation, although hard to spell and fun to say, opens some un-imaginable possibilities.
Think of it like this: you've given your MCP servers the ability to raise their hand and ask you questions. Not just any questions - structured ones.
Imagine your MCP server is like a smart assistant that needs clarification:
- "Hey boss, which database should I connect to?" (dropdown: prod, staging, dev)
- "What format do you want the report in?" (multiple choice: PDF, CSV, JSON)
- "Should I include archived records?" (yes/no toggle)
- "How many results do you want?" (number field: 10, 50, 100, all)
You get a nice form popup, and you can:
- Accept: "Here's your answer, now get to work!"
- Reject: "Nope, not doing that right now"
- Cancel: "Forget I asked..."
The cool part? After you accept or reject, the server might come back with another question. Maybe you said "prod database" but forgot to specify the region. Or maybe you entered an invalid number and it's giving you another shot. It's like a conversation, but with forms instead of freeform chat.
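Under the hood, that "form popup" is a structured request/response pair. Here's a sketch of the shapes, based on the MCP spec's elicitation flow (the question, field names, and schema are made up for illustration):

```python
# Server -> client: a structured question, not freeform chat.
elicit_request = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "elicitation/create",
    "params": {
        "message": "Which database should I connect to?",
        "requestedSchema": {                      # a restricted JSON Schema
            "type": "object",
            "properties": {
                "environment": {"type": "string",
                                "enum": ["prod", "staging", "dev"]},
                "includeArchived": {"type": "boolean"},
            },
            "required": ["environment"],
        },
    },
}

# Client -> server: the user filled in the form and hit Accept.
elicit_response = {
    "jsonrpc": "2.0",
    "id": 7,
    "result": {
        "action": "accept",                       # or "decline" / "cancel"
        "content": {"environment": "staging", "includeArchived": False},
    },
}
```

The schema is what lets the client render a real form (dropdown, toggle, number field) instead of a blank text box - and what lets both sides validate the answer.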
Now here's the important bit - these servers can't just ask for your passwords or secrets. That's a big no-no. The whole system is designed to be transparent about who's asking for what, and you always have the power to say "nah, I'm good" and cancel out.
The Security Fine Print (that actually matters)
Since we're building on solid foundations (remember that whole SSL/TLS journey?), elicitation comes with built-in guardrails. We spent 30 years learning these lessons the hard way - Heartbleed, BEAST, POODLE - so let's not repeat them:
- No sketchy questions: Servers literally CAN'T ask for passwords, API keys, or your mother's maiden name
- You see who's asking: Like caller ID, but for your MCP servers - no anonymous requests
- Reject button always works: Don't like the question? Slam that reject button. No explanation needed
- Schema validation: Both sides check that the questions make sense - no asking for your shoe size in a database field
- Rate limiting: Your servers can't spam you with 1000 questions per second (looking at you, chatty database connector)
- Clear requests: No vague "gimme data" - you see exactly what they want and why
Real talk though: When I say "CAN'T", I mean "SHOULDN'T" - like really, really shouldn't. Just like Terry wouldn't use duct tape on a Corvette's upholstery, a good MCP craftsperson won't ask for your passwords through elicitation. The protocol gives you the guidelines, but it's up to the developer to not be that person who ruins it for everyone. Don't be the reason we need MCP 2.0 with mandatory security enforcement, ya know?
Basically, it's like having a well-trained assistant who knows the boundaries and respects them. Way better than the old days of "just put your credentials in this sketchy config file and hope for the best."
I'm launching a simple framework on top of FastMCP that makes elicitation easier than getting a targeted ad for something you just whispered about - or cracking a '90s password.
MCP Matryoshka dolls: Servers all the way down
MCP is so composable! Servers can be clients and vice-versa. Think about an MCP server in front of your MCP server...
Remember those rabbits we couldn't count? Turns out they're all connected - and that's the beauty of it. Just like MCP provides context between tools, these architectural patterns provide context between systems.
Imagine MCP gateways that route requests based on context. Or security layers that validate every tool call before it hits your production systems. Multi-tenant MCP architectures where each client gets their own isolated server cluster. Dynamic service discovery where MCP servers find and compose other MCP servers at runtime.
What about MCP-to-legacy protocol bridges? Or performance optimization layers that cache and batch requests across multiple backend servers? Cross-organizational MCP federations where companies can securely share tools without exposing internals?
The rabbit hole goes deep. Enterprise customers are already asking about things that don't exist yet - MCP load balancers, audit trails for compliance, circuit breakers for when upstream servers go haywire.
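None of those gateway products exist as standard pieces yet, but the core pattern is simple enough to sketch. Everything below is hypothetical - made-up class and method names, not a real SDK:

```python
class McpGateway:
    """Hypothetical MCP gateway: a server to its caller, a client to its backends."""

    def __init__(self, backends):
        # backends: name -> any object exposing list_tools() and call_tool()
        self.backends = backends

    def list_tools(self):
        # Aggregate every backend's tools, namespaced to prevent
        # lookalike-tool collisions between backends.
        tools = []
        for name, client in self.backends.items():
            tools += [f"{name}/{tool}" for tool in client.list_tools()]
        return tools

    def call_tool(self, qualified_name, args):
        backend, tool = qualified_name.split("/", 1)
        # A real gateway would authenticate, audit-log, rate-limit,
        # and circuit-break here before forwarding.
        return self.backends[backend].call_tool(tool, args)
```

Because the gateway exposes the same two operations it consumes, you can stack them: a gateway in front of a gateway, which is exactly the Matryoshka-doll composability above.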
If you smell what I'm 👨‍🍳 cooking, let's talk.
📢 Enjoyed this journey through web history?
Share it with that one friend who thinks they know everything about web development.
(Or that manager who wants to "just add AI" to everything - I like money)
[Share Article] | Jump to MCP Part
🏆 ACHIEVEMENT: FULL STACK HISTORIAN 🏆
You just read 5,000+ words about web security and MCP.
That's like... a small book. About protocols. For fun.
Your rewards:
- ✅ Understanding of 30+ years of web evolution
- ✅ Context for why MCP matters
- ✅ Ability to drop "actually, TLS 1.0 was broken" at parties
- ✅ Permission to build cool shit with MCP
[Tweet Your Achievement]