12 Years of Elasticsearch Taught Me This About OpenSearch

Weblink Technology Team
July 1, 2025

The Conversation That Changed Everything

Three weeks ago, I was sitting across from the CTO of a major financial services company. We’d been working together for months on their search infrastructure, and frankly, things weren’t going well. Their Elasticsearch cluster was burning through budget faster than a startup burns through venture capital, and the performance? Let’s say their users weren’t happy.
“Doug,” he said, leaning back in his chair with that look I’ve seen too many times over the past 12 years, “if you were starting fresh today, what would you build?”
 
I didn’t hesitate. “OpenSearch.”
 
The room went quiet. His head of engineering put down his coffee cup. You have to understand—I’ve been the Elasticsearch expert for over a decade. I’ve built my reputation on it. I’ve saved companies millions with it. Hell, I cut Capital One’s search costs by $3.5 million using Elasticsearch optimization techniques.
 
But something fundamental has shifted in the enterprise search landscape, and I’d be doing my clients a disservice if I didn’t acknowledge it.

How I Got Here (And Why It Matters)

Let me back up a bit. I didn’t start my career planning to become the guy who argues about search engines on LinkedIn. Twelve years ago, I was just another developer trying to solve a problem: how do you make sense of massive amounts of data quickly?
 
Elasticsearch was the answer then. It was revolutionary. I remember the first time I saw a complex query return results in milliseconds instead of minutes—it felt like magic. I built my first enterprise implementation for a cybersecurity firm, and suddenly I was the “search guy.”
 
Over the years, I’ve architected solutions for some notable companies. Capital One’s cybersecurity division, where I led the re-architecture that saved them millions. The Boston Consulting Group, where we developed hybrid relevance models for their extensive knowledge base. OneMain Financial, where we migrated their entire logging infrastructure with zero downtime.
 
Each project taught me something new about enterprise search. But more importantly, each project revealed to me the patterns—what works, what doesn’t, and what’s next.

The Five Lessons That Shifted My Perspective

Lesson 1: Performance Isn't Just About Speed—It's About Possibility

Everyone talks about OpenSearch 3.1’s performance improvements. The 9.3x faster index builds, the 3.75x cost reduction—impressive numbers, sure. But here’s what those benchmarks don’t tell you: they represent entirely new possibilities.
 
I learned this the hard way during a project last year. We were building a vector search system for a client’s document repository—think millions of PDFs, contracts, legal documents, the works. With traditional CPU-based indexing, we were looking at 45-minute build times for incremental updates. Forty-five minutes. For a system that needed to stay current with real-time document uploads.
 
The client was getting frustrated. The users were complaining. And honestly, I was starting to question whether we’d bitten off more than we could chew.
 
Then OpenSearch 3.1 dropped with GPU acceleration support.
 
I’ll never forget the first test run. I set up the new indexing pipeline on a Friday afternoon, expecting to come back Monday morning to check the results. Instead, I got a Slack notification at 6:47 PM: “Index build complete.”
 
I laughed out loud. My wife called from the kitchen to ask what was so funny. At first I assumed something was broken. It wasn’t: forty-five minutes had become four minutes and thirty-seven seconds.
 
That’s when I realized this wasn’t just about faster performance—it was about unlocking use cases that were previously impossible. Real-time semantic search across massive document collections. Live vector similarity matching. AI-powered content recommendations that actually work.
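To make that concrete, here is a minimal sketch of the kind of k-NN index mapping a vector-indexing pipeline like this targets in OpenSearch. The index name, embedding field, dimension, and HNSW parameters below are illustrative assumptions for demonstration, not the client’s actual configuration:

```python
import json

# Illustrative OpenSearch k-NN index body. All names and parameters here
# are assumptions for the sketch, not the project's real configuration.
index_body = {
    "settings": {
        "index.knn": True  # enable the k-NN plugin for this index
    },
    "mappings": {
        "properties": {
            "doc_embedding": {
                "type": "knn_vector",
                "dimension": 768,      # must match the embedding model's output size
                "method": {
                    "name": "hnsw",    # graph-based approximate nearest neighbor
                    "space_type": "l2",
                    "engine": "faiss", # faiss index builds are what GPU acceleration targets
                },
            },
            "title": {"type": "text"},
        }
    },
}

# You would send this as: PUT /documents (for example via the opensearch-py client)
print(json.dumps(index_body, indent=2))
```

The mapping itself is the same whether builds run on CPU or GPU; the acceleration happens on the cluster side, which is why the speedup required no application changes.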

Lesson 2: Licensing Complexity Is Innovation Poison

Here’s something they don’t teach you in computer science classes: sometimes the biggest technical challenges aren’t technical at all.

I’ve watched brilliant engineering teams spend entire sprints—not building features, not solving problems, but trying to figure out whether they can legally use a piece of software. I’ve seen product launches delayed by months because legal teams couldn’t come to terms with the licensing implications.

The worst case I encountered was with a consulting firm (I can’t name names, but they’re big enough that you’d recognize them). They’d built an entire search platform on Elasticsearch, invested hundreds of thousands in development, and were ready to launch. Then their legal team discovered the licensing restrictions around hosted services.

Three months. That’s how long it took to sort out the legal implications. Three months of engineering time, legal fees, and opportunity cost—all because of licensing complexity.

With OpenSearch’s Apache 2.0 license, that conversation takes about five minutes. “Can we use it?” Yes. “Can we modify it?” Yes. “Can we host it for clients?” Yes. “Any restrictions?” Nope.

I’ve started timing these conversations. The longest one took seven minutes, and that was only because the legal counsel wanted to read the entire license text himself. (Honestly, I respect the thoroughness, but it was a bit painful to sit through.)

Lesson 3: Community Governance Actually Determines Platform Direction

I used to think governance was just corporate buzzword bingo. Then I watched what happened when the Linux Foundation took over OpenSearch stewardship.
 
The difference is night and day. Feature requests get heard. Bug fixes happen faster. The roadmap reflects what users need, not what drives subscription revenue.
 
Last quarter, I submitted a feature request for better ML Commons integration with custom models. Within two weeks there was a GitHub issue, a community discussion, and a committed development timeline. The feature will ship in the next minor release.
 
Compare that to my experience with vendor-controlled platforms, where feature requests often disappear into black holes and roadmaps appear to be designed by sales teams rather than engineers.
 
What finally convinced me was tracking the velocity of innovation. OpenSearch has shipped more significant features in the past 18 months than Elasticsearch shipped in the previous three years. Vector search improvements, GPU acceleration, simplified ML operations, enhanced security features—the pace is remarkable.

Lesson 4: AI Integration Separates Market Leaders from Everyone Else

The BCG project was my wake-up call about AI integration. We were building a search system for their knowledge management platform—thousands of documents, multiple languages, complex legal and business terminology.
 
Traditional keyword search was… fine. Users could find documents if they knew exactly what to search for. But knowledge work isn’t like that. People need to find relevant information based on concepts, not just keywords.

When we added semantic search with vector embeddings, everything changed. Suddenly, lawyers could find relevant case studies by describing scenarios. Consultants could discover similar client situations across different industries. The system evolved from being a document repository to a full-fledged knowledge discovery platform.

OpenSearch makes this stuff accessible. The ML Commons framework, the built-in vector support, the seamless integration with popular ML models—it’s all there. No extra licenses, no vendor negotiations, no “enterprise tier” upsells.

I’ve implemented similar systems on other platforms, and the difference is stark. With OpenSearch, I spent my time solving business problems. With other platforms, I spent my time fighting configuration complexity and licensing restrictions.
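For readers who haven’t seen it, the semantic search described above maps to OpenSearch’s `neural` query clause. The field name, query text, and `model_id` below are illustrative placeholders; a real `model_id` comes from registering an embedding model through ML Commons first:

```python
import json

# Illustrative semantic search request using OpenSearch's neural query clause.
# The field name, query text, and model_id are placeholder assumptions.
query_body = {
    "query": {
        "neural": {
            "doc_embedding": {
                "query_text": "precedents for cross-border data transfer disputes",
                "model_id": "<your-registered-model-id>",  # hypothetical placeholder
                "k": 10,  # number of nearest neighbors to retrieve
            }
        }
    }
}

# Sent as: POST /documents/_search
print(json.dumps(query_body, indent=2))
```

The plugin embeds `query_text` with the registered model at search time, which is what lets users describe a scenario instead of guessing keywords.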

Lesson 5: The Complexity Gap Is Killing Enterprise Search Adoption

You know what kills more enterprise search projects than anything else? It’s not performance. It’s not cost. It’s complexity.
 
I’ve seen teams with million-dollar budgets give up because configuring search was too hard. Smart people, good intentions, but the tools fought them every step of the way.
 
That’s actually why I built SmartSearchTools in the first place. After watching the same configuration challenges repeatedly kill projects, I realized we needed a bridge between powerful search engines and practical usability.
 
The platform works with both Elasticsearch and OpenSearch, providing a unified interface that abstracts away the complexity while preserving the full power of the underlying engines. Think of it as the difference between writing assembly code and using a high-level programming language—you get the same results with dramatically less effort.
 
Now that we’re transitioning SmartSearchTools to open source, that bridge becomes available to everyone—no more choosing between powerful and straightforward. You can have both.

Why This Moment Matters

Three trends are converging right now that make this more than just another technology transition:
 
AI is everywhere. Every organization wants semantic search, vector databases, and RAG frameworks. It’s no longer a nice-to-have—it’s table stakes for staying competitive.
 
Budgets are under pressure. CFOs are asking hard questions about every software license. “Do we need to pay for this?” is becoming a common refrain in budget meetings.
 
Open source is winning. Not just in search, but everywhere. The best tools, the fastest innovation, the strongest communities—they’re all open source.
 
OpenSearch sits right at the intersection of these trends. It provides the AI capabilities organizations need, at a cost structure that makes CFOs happy, with the open-source flexibility that developers prefer.

The Strategic Implications

For CTOs and IT directors, this isn’t just about choosing a search platform. It’s about positioning your organization for the next decade of digital transformation.
 
The teams that move to OpenSearch first will have advantages that compound over time. Better performance at lower costs. Access to cutting-edge AI features without vendor lock-in. The ability to attract and retain top talent who want to work with modern, open technologies.
 
But there’s also a risk in waiting too long. The longer you stay on legacy platforms, the more technical debt you accumulate. The more expensive migration becomes. The further behind you fall in AI capabilities.

What I Tell My Clients

When someone asks me what to build today, here’s what I say:
 
For new projects: Start with OpenSearch. Don’t even think about it. The technical advantages are clear, the cost benefits are immediate, and you’ll be building on a platform designed for the future.
 
For existing Elasticsearch implementations: Start planning your migration. The APIs are compatible, the transition is smoother than you think, and the benefits are immediate. I’ve migrated 47 companies now, and the pattern is consistent: better performance, lower costs, and happier teams.
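One common mechanics-level starting point for such a migration is a reindex-from-remote request, pulling data from the old Elasticsearch cluster into OpenSearch over its compatible API. The hostnames, index names, and credentials below are placeholders, and the source host must first be allow-listed in the destination cluster’s settings:

```python
import json

# Illustrative reindex-from-remote body for pulling an index from an
# existing Elasticsearch cluster into OpenSearch. All hosts, index names,
# and credentials are placeholder assumptions.
reindex_body = {
    "source": {
        "remote": {
            "host": "https://old-elasticsearch.example.com:9200",
            "username": "migration_user",
            "password": "<secret>",
        },
        "index": "app-logs",
    },
    "dest": {"index": "app-logs"},  # destination index on the OpenSearch cluster
}

# Sent to the OpenSearch cluster as: POST /_reindex
print(json.dumps(reindex_body, indent=2))
```

Running this per index (or via snapshot/restore for large volumes) is one way teams achieve the zero-downtime cutovers mentioned above, but treat this as a sketch rather than a full migration plan.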
 
For everyone: Look at SmartSearchTools. It works with both platforms, eliminates most configuration complexity, and is about to become significantly more accessible.

The Real Talk

After 12 years of doing this, I’ve learned to spot the winners early. OpenSearch has everything: technical excellence, community support, vendor backing from AWS, and momentum that’s accelerating.
 
But here’s the thing about early adoption advantages—they’re real, but they don’t last forever. The teams that move first get the best talent, the most experience, and the biggest competitive edge. The teams that wait get… well, they get to be followers.
 
I’ve been on both sides of this equation. I’ve been the guy implementing cutting-edge technology that gives my clients unfair advantages. And I’ve been the guy trying to catch up after the market has already moved.
 
Trust me, you want to be in the first group.

Looking Forward

The enterprise search landscape is transforming faster than I’ve ever seen. Vector databases are becoming mainstream. Semantic search is moving from experimental to essential. AI-powered analytics are shifting from science fiction to business requirement.

OpenSearch is positioned perfectly for this transformation. The platform’s roadmap for 2024-2025 includes ambitious goals across nine major themes, from ease of use improvements to advanced AI integration. The community is growing rapidly, with over 1,400 contributors and major enterprise backing.

But the real opportunity isn’t just about technology—it’s about timing. We’re at one of those inflection points where early movers gain advantages that last for years.

The question isn’t whether OpenSearch will dominate the enterprise search market. The question is whether your organization will benefit from being the first to arrive.

References

[1] OpenSearch Project. “Get started with OpenSearch 3.1.” OpenSearch Blog, June 24, 2025. https://opensearch.org/blog/get-started-with-opensearch-3-1/
[2] Lardinois, Frederic. “AWS brings OpenSearch under the Linux Foundation umbrella.” TechCrunch, September 16, 2024. https://techcrunch.com/2024/09/16/aws-brings-opensearch-under-the-linux-foundation-umbrella/
[3] OpenSearch Project. “OpenSearch Project Roadmap 2024–2025.” OpenSearch Blog, September 12, 2024. https://opensearch.org/blog/opensearch-project-roadmap-2024-2025/
[4] SmartSearchTools. “Getting Started with Smart Search Tools.” SmartSearchTools Documentation. https://docs.smartsearchtools.com/docs/getting-started

About the Author: Douglas Miller is an Elasticsearch & Generative AI Search Architect with over 12 years of enterprise-scale search experience across finance, cybersecurity, and SaaS industries. As Principal Elasticsearch Architect at Capital One’s Cybersecurity Division, he led Elasticsearch re-architecture initiatives that reduced annual spending by $3.5 million while supporting 1 PB/month of data ingestion. Douglas has architected search solutions for Boston Consulting Group, OneMain Financial, and other Fortune 500 companies, specializing in hybrid relevance models, semantic search, and AI-powered analytics. He is the founder of SmartSearchTools.com, a no-code LLM-powered search platform that bridges the gap between enterprise search complexity and user accessibility. His expertise spans the complete Elastic Stack, OpenSearch, Python automation, cloud observability, and generative AI integration. Connect with Douglas at WeblinkTechnologies.com.

Ready to transform your Elasticsearch experience?

Our team of Elasticsearch experts is ready to help you implement advanced search capabilities tailored to your specific needs.