In today’s data-driven world, organizations are faced with the daunting task of managing and analyzing vast amounts of data, often employing artificial intelligence, machine learning, or other forms of advanced analytics. WinterCorp helps executives with their most critical decisions in the creation, modernization, and cost optimization of such systems, typically starting with the foundation: the data platform or ecosystem.
As an independent consulting firm with over 30 years of experience, WinterCorp has a proven track record of solving the toughest problems in applying analytic data technologies at the scale required in the largest and most data-intensive enterprises. Among our clients are the largest banks, the most data-intensive government agencies, and leading companies in pharma, healthcare, retail, shipping, and other industries.
The Face of WinterCorp: Richard Winter, CEO, Opens Up in an Exclusive Interview
WinterCorp is recognized as one of the Top 5 Data Architecture Consulting Firms. How does this accomplishment make you feel, as the company’s CEO?
I’m honored! I feel that this award brings recognition to the outstanding effort and long-term, highly focused investment our team has made to provide the best possible solutions and advice to our customers – and to the deep expertise and know-how we bring to our engagements. We have spent decades measuring and analyzing the architecture and performance of analytic data platforms for the data warehouse and data lakehouse against a wide variety of demanding customer requirements.
Tell us about the incredible story of the company and its journey so far.
I started the company in 1992 with the goal of helping executives with strategic decisions concerning the architecture, engineering, and operation of large and complex databases. This was based on my prior work as the CTO of a pioneering database vendor, Computer Corporation of America (CCA), that was focused on these same areas.
Soon, I was approached by the CIO of a freight company, now part of FedEx, that was maintaining a large-scale, near real-time database to support both operational and analytical requirements. His company was growing rapidly, and he wanted my help in selecting a next-generation platform for this database. This database was required to be online, accepting updates and queries, 7×24, 364 days a year. Data latency was low – just a few seconds – and query complexity was remarkably high. Other databases of this sort existed in the industry but were able to process only the simplest queries. This customer explicitly wanted the capability to handle quite complex queries on voluminous near real-time data.
Our team created a realistic benchmark including a synthetic version of his data and his workload, factoring in five years of expected growth. We invited proposals from the leading vendors of the day, focusing on the parallel database architectures that were at that time relatively new in the market. Several vendors claimed they could meet the requirements, but only one could demonstrably satisfy the two central criteria: successfully performing the large-scale custom benchmark test, and showing how the solution could continue operating without interruption or degradation through every known mode of failure. As a result, we were able to recommend the one solution that was consistent with the customer’s critical business interests.
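As a rough illustration of the growth-scaling step in such a benchmark, here is a minimal sketch in Python; the event rates, growth rate, freight classes, and record shape are hypothetical stand-ins, not the client’s actual data or our benchmark tooling:

```python
import random
import datetime

# Hypothetical figures for illustration only: a current workload rate
# and an assumed compound annual growth rate over a five-year horizon.
BASE_EVENTS_PER_SEC = 200        # shipment-status updates today (assumed)
ANNUAL_GROWTH = 0.35             # assumed 35% compound annual growth
HORIZON_YEARS = 5

FREIGHT_CLASSES = ["standard", "hazardous", "refrigerated", "fragile"]

def target_rate(base: float, growth: float, years: int) -> float:
    """Project the event rate the platform must sustain at end of horizon."""
    return base * (1 + growth) ** years

def synthetic_event(now: datetime.datetime) -> dict:
    """One synthetic shipment-status record, shaped like production data."""
    return {
        "shipment_id": random.randrange(10**9),
        "freight_class": random.choice(FREIGHT_CLASSES),
        "terminal": f"T{random.randrange(120):03d}",
        "event_time": now.isoformat(),
    }

if __name__ == "__main__":
    rate = target_rate(BASE_EVENTS_PER_SEC, ANNUAL_GROWTH, HORIZON_YEARS)
    print(f"Benchmark must sustain ~{rate:,.0f} events/sec at year {HORIZON_YEARS}")
    print(synthetic_event(datetime.datetime.now()))
```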
Since then, we have repeated variations of this story about 50 times in different industries, usually in situations where the customer is near or beyond the frontiers of previous experience. We are sought out by customers who are planning a large-scale analytic data system – now often in the cloud and now often involving AI – and who recognize that their investment will be at risk if they build upon a platform that cannot scale efficiently to support their solution. These executives are aware that the consequences of building your analytic data solution on the wrong platform can be devastating. On the other hand, a good fit with your specific business needs and interests can give you a large strategic advantage.
Digital Skyscrapers
A metaphor for these projects is that our customers are creating the skyscrapers of the digital world: that is how I think of the larger data warehouses, lakehouses, and similar systems. They are like skyscrapers in their scale and the magnitude of the investment involved. Skyscrapers have issues that are rarely significant in ordinary buildings: a skyscraper can literally collapse if the foundation is not right or if proper allowances are not made for the wind.
No one would build a skyscraper without expert advice and testing from structural engineers and architects who specialize in such projects. Similarly, no one should implement an enterprise-class data warehouse or similar system – whether on-prem, in the cloud, or hybrid – without expert advice and testing from a firm like ours.
The chart below shows how the right platform can make such a big difference when operating at enterprise levels of scale. In this example, a substantial insurance company was considering moving its data warehouse to the cloud. Boris Zibitsker, a WinterCorp business partner, used a sophisticated, proven performance model to project the company’s annual cloud bill on each of the three popular cloud data platforms under consideration. He showed that the best choice for this company, platform C, would cost about one-fifth as much as platform A and one-third as much as platform B. Platforms A and B are better known and more widely used; however, they don’t work well for this customer’s combination of database and workload.
“If the customer had simply ‘gone with the market trend’, the result would have been a waste of $12 million per year.”
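To make that arithmetic concrete, here is a minimal sketch of the comparison; the dollar figures are hypothetical, chosen only to be consistent with the ratios and the $12 million figure above, not taken from the actual engagement:

```python
# Hypothetical annual cost projections consistent with the ratios above:
# platform A ~5x platform C, platform B ~3x platform C (figures in $M/yr).
projected_annual_cost = {
    "Platform A": 15.0,
    "Platform B": 9.0,
    "Platform C": 3.0,
}

best = min(projected_annual_cost, key=projected_annual_cost.get)
for platform, cost in sorted(projected_annual_cost.items(), key=lambda kv: kv[1]):
    penalty = cost - projected_annual_cost[best]
    print(f"{platform}: ${cost:.1f}M/yr (+${penalty:.1f}M/yr vs {best})")
# Choosing Platform A over Platform C here wastes $12.0M per year.
```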
What has been the biggest contribution to the company’s success? What are you focused on right now?
Quantitative Approach: There are three keys to our success. The first is our quantitative approach to platform evaluation and other major architectural and design decisions. Our typical project involves quantitative analysis against the customer’s specific requirements, which we derive from the customer’s business interests and vision – often using sophisticated modeling, custom testing, or both. We give our customers recommendations supported by concrete tradeoffs concerning cost of operation, performance, availability, elasticity, and other factors that they view as critical to their business interests and strategies. We also make sure that the platform we recommend can support the “digital skyscraper” to be built upon it.
Business Outcome Focus: The second key is that all of our work is outcomes-focused, grounded in the business needs of key stakeholders in the customer organization. We make a thorough analysis of critical business needs and interests and make certain that our understanding of requirements is anchored in the outcomes that are most critical to the client. We have developed our own unique techniques for interviewing business stakeholders, developing business scenarios, and deriving quantitative technical requirements from the shared business and technical vision of those involved. Our approach is illustrated in the chart below.
Here is an example of that: in our freight company engagement, the manager of a freight terminal needed to know what volume of freight was going to arrive at his or her terminal during the coming eight-hour shift. The manager needed to know this exactly two hours before the start of the shift so that the right number of workers could be brought on. But it wasn’t just the total volume: there were several classes of freight that required special certified skills and special handling (e.g., hazardous freight, refrigerated freight, fragile freight, etc.). So accurate predictions were needed. These predictions depended on real-time events affecting hundreds of incoming and outgoing vehicles whose arrival and/or departure might be affected by weather, traffic, mechanical problems, personnel problems, or other events. This specific requirement was expressed in a detailed business scenario developed jointly with the client team. After defining the scenario, we were able to identify the characteristic queries and analytics involved and project the frequency with which they were likely to occur. Similarly, we jointly created projections of the frequency and nature of freight movements that were being tracked in the database.
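To give a sense of what a “characteristic query” derived from such a scenario might look like, here is a small sketch using an in-memory SQLite table; the schema, terminal codes, and figures are hypothetical, and the real system ran such analytics continuously against near real-time data at far larger scale:

```python
import sqlite3

# Hypothetical schema: one row per freight unit currently en route.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE inbound_freight (
    shipment_id   INTEGER,
    terminal      TEXT,
    freight_class TEXT,      -- e.g. 'hazardous', 'refrigerated', 'fragile'
    weight_lbs    REAL,
    eta           TEXT       -- predicted arrival time, ISO-8601
);
INSERT INTO inbound_freight VALUES
    (1, 'T042', 'hazardous',    1200, '2025-01-01T06:30:00'),
    (2, 'T042', 'standard',     8000, '2025-01-01T09:10:00'),
    (3, 'T042', 'refrigerated', 2500, '2025-01-01T11:45:00');
""")

# Run two hours before a shift starting at 06:00: estimate the volume
# arriving per freight class during the 06:00-14:00 shift at terminal T042.
rows = conn.execute("""
    SELECT freight_class, COUNT(*) AS shipments, SUM(weight_lbs) AS total_lbs
    FROM inbound_freight
    WHERE terminal = 'T042'
      AND eta >= '2025-01-01T06:00:00' AND eta < '2025-01-01T14:00:00'
    GROUP BY freight_class
""").fetchall()
for freight_class, shipments, total_lbs in rows:
    print(f"{freight_class}: {shipments} shipments, {total_lbs:,.0f} lbs")
```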
The result was a complex analytic driven by high-volume, high-velocity data coming from many sources and changing every few seconds. In the mid-1990s, this was a very unusual requirement on a large scale. We were able to devise realistic, faithful large-scale tests of the platforms and uncover strengths and weaknesses that would not have been revealed any other way.
Most, if not all, other consultants do not approach these projects in this way. They use standard software evaluation techniques based on feature lists, surveys of often questionable relevance, and standardized benchmark data that rarely captures the complexity of real customer requirements. Most consultants make architectural recommendations based solely on logical and conceptual factors and principles.
They rarely address the question of whether the foundation is capable of meeting the engineering requirements of the customer’s specific digital skyscraper as envisioned – that is, can the platform under consideration support the scale and complexity of the database and workloads as they will be built out over five years? Will they support the customer’s business strategies and solutions at scale? Are they up to the demands of generative AI, machine learning, and other advanced analytics at large scale? Will they be cost-effective and performant? Will they be sufficiently available in the presence of all types of failures and errors? Will they meet the operating schedules and service levels required? Will they be agile, elastic, and able to change as requirements change? Will they be manageable?
“We answer these questions decisively, with analysis, modeling, and testing. Most consultants do not.”
We find very large differences between the options under consideration. This is a key part of the value we provide to our clients.
We Don’t Partner: Unlike most consultants and system integrators, we don’t partner with the vendors we evaluate. When we are retained by a customer organization to develop and evaluate options, we are able to provide unbiased, independent advice precisely because we choose not to partner and therefore not to benefit financially by recommending vendor A over vendor B. This is clearly different from most others.
What have been the most significant milestones and achievements so far for WinterCorp?
Our first milestone was proving our approach works in practice for a large organization with demanding business and technical requirements. We did that in the freight company engagement I described in the prior question.
Our second milestone was proving that our approach would work in a variety of industries and with very different technical challenges. We did that in the ten or so major engagements we conducted over the following decade, serving leading banks, insurance companies, retailers, pharma companies, healthcare providers, and government agencies with similar success.
Our third major milestone was proving that we could adapt our methods and approach to the massive changes in technology and business practice that have occurred over the last thirty years. We are doing our architectural analysis, testing, and measurement on systems today that are millions of times larger and a thousand times more complex than the systems we worked with in the 1990s. Today’s systems are often in cloud, hybrid, or multi-cloud configurations; they may be centralized or distributed; they often involve machine learning or GenAI in the database; they can involve open table formats, such as Iceberg; they often involve a data fabric and/or data products; and technology continues to advance rapidly. These advances and changes are valuable to many customers, but they also increase the complexities that customers must cope with.
We keep up with these technologies – and with the data platforms on which they are implemented – by studying their architecture, testing them, measuring them, and gaining hands-on experience in implementing them. As a result, our ability to assess, quantify, and advise on the risks, performance, and cost factors is more in demand than ever.
Reducing Cloud Charges: There has been one other key development in the past few years: some customers are struggling with unacceptably large cloud bills and/or unacceptable system performance in the cloud. This typically happens only with customers who are operating at a large scale in the cloud.
These are customers who did not consult with us when they chose their cloud data platform. We have been called in by such customers when they discover that their cloud data warehouse or lakehouse is costing too much or not functioning as expected. In these situations, we have been able to help mitigate the problems short term and then collaborate with the customer on a strategy to overcome the challenges on a long-term basis.
“Our recommendations for cost reduction result in an excellent return on investment and we quantify it in advance.”
A particularly valuable asset we bring to such situations is Dr. Norbert Kremer, who is a certified FinOps practitioner as well as an expert in cloud data platform architecture and performance. FinOps is the emerging discipline of cloud cost management. Unlike most other FinOps providers, we are able to combine cloud cost analysis with deep expertise in architecture and performance to create solutions for the client that reduce cost while remaining aligned with business and architectural objectives.
Could you tell us a bit about your educational and professional background and how it contributes to serving your clients?
My training as an engineer was at Case Western Reserve in Cleveland and the University of Michigan College of Engineering in Ann Arbor. I feel that this helped me develop an analytical and measurement-based approach to evaluating options and finding the best solution. I also took graduate courses in business as a special student at the MIT Sloan School.
My early career was devoted to pioneering research and development, including hands-on coding, in the creation of database products and solutions at CCA, a startup company near MIT. In this work, I helped create a database product that came to be used on 1,000 of the largest and most complex databases in the world. As one of the developers of the product and later the CTO, I was involved in all aspects of development, support, sales, marketing, and implementation. Customers coming to me for advice on critical decisions gave me the idea for the consulting practice that I lead to this day.
Unlike most consultants, I understand in detail how database engines work; why they all have limitations and differ from one another in specific ways; and why it can be critical to a customer – especially one operating at a large scale – to have the right fit. This understanding is the basis for the unique approach to quantitative platform evaluation and architecture that my team and I have developed in our practice.
What are the biggest opportunities and obstacles you see for innovation in the data architecture industry?
The biggest opportunities in data analytics now are in the application of AI and machine learning at a large scale. These technologies are changing the way companies compete in analytics. Because of our deep expertise in the architecture and measurement of performance and efficiency at large scale, we are in a position to help customers apply these technologies for competitive advantage.
In addition, some of our customers now implement distributed or multi-engine data ecosystems based on such concepts as a data fabric, a data mesh, data products, and/or federated data query. When large scale, heavy workloads, demanding operating requirements, or concerns about operating costs are involved, our special expertise comes into play.
“Customers who have the right data platform for their needs will save tens of millions of dollars in system costs if not hundreds of millions.”
Customers with the right data platform(s) and architectures will be able to leverage AI and machine learning on enterprise data at a large scale. They will be able to build databases and solutions that are agile and efficient, integrated and manageable, scalable and up to date, and able to satisfy their operating requirements. They will be able to create a competitive advantage grounded in advanced analytics, data science, and AI/ML.
Customers who have the wrong data platform will waste many millions of dollars, struggle to move forward, find they cannot integrate and leverage data across their business, and fail to apply AI and machine learning at a large scale (they won’t be able to afford it, or it will take far too long to implement or run solutions). They will not be able to compete in their markets!
So, what is the opportunity for data architects like WinterCorp who use a quantitative approach? They can help organizations meet the key technical and business challenges of the coming decade.
The obstacles? In our focus area, the key obstacles are scale and complexity – the challenges of the digital skyscraper. To guide customers to good solutions, you need to understand – and be able to address – the scale and complexity involved in building data solutions for large enterprises. These issues are largely ignored in most education and training. Few data professionals – even those who have built successful solutions at a more modest scale – are sufficiently aware of these issues; many will build solutions that fail at enterprise levels of complexity. We understand these issues through our decades of hands-on analysis, measurement, and implementation at the frontiers of scale and complexity. Usually, we collaborate with the customer’s internal team of data professionals, complementing their expertise with our specialized knowledge of engineering issues at a large scale.
In light of the industry’s volatility, what approach do you undertake to develop robust data architecture strategies for clients?
We address rapid change in a specific way. We collaborate with our clients in envisioning the business processes and strategies that the data ecosystem will support in the strategic time frame of the client (typically three to seven years). We create a specific business vision together, and then a shared understanding of the role the data repository (or repositories) will play in enabling and supporting that vision. We then create quantitative estimates of the macro requirements at appropriate intervals: for example, we will estimate database size and structure at one, three, five, and seven years. We’ll make similar estimates for known workloads. We’ll then factor in industry trends and build in a margin for requirements that cannot be estimated in advance.
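As a simple illustration of such a macro estimate, here is a sketch of the interval projections with a margin for unknowns; the starting size, growth rate, and margin are hypothetical:

```python
# Hypothetical macro estimate: project database size at planning intervals,
# then add a margin for requirements that cannot be foreseen.
BASE_SIZE_TB = 200          # assumed current database size
ANNUAL_GROWTH = 0.40        # assumed 40% compound annual growth (known drivers)
UNKNOWNS_MARGIN = 0.25      # assumed 25% headroom for unanticipated needs

for year in (1, 3, 5, 7):
    projected = BASE_SIZE_TB * (1 + ANNUAL_GROWTH) ** year
    with_margin = projected * (1 + UNKNOWNS_MARGIN)
    print(f"Year {year}: ~{projected:,.0f} TB projected, "
          f"plan for ~{with_margin:,.0f} TB with margin")
```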
In our experience – after doing this with clients for more than thirty years – we know that there will always be some changes you cannot anticipate. But if you plan for those you can see are likely, build in a margin for unknowns, and use sound principles to provide for ongoing change, the outcomes are good to excellent. The customers who have the most trouble are the ones who close their eyes to a business need that is obviously coming within a few years.
And, of course, there are many customers who completely ignore the implications of scale, accepting on faith their vendor’s claims that their platform is scalable. What the vendors don’t say is that most platforms scale well only for simple problems. It is an entirely different question whether a platform scales well for more complex requirements. These are problems you can avoid with our approach.
Please explain briefly the process you follow to improve a company’s existing data architecture.
Primarily, we help the customer select the platform or combination of platforms – or design the distributed data ecosystem – that is right for their enterprise-class data and analytic requirements. In brief, our process is:
- Define the business vision and desired outcomes
- Define the role of the data lakehouse/warehouse/ecosystem in enabling the vision/outcomes
- Define the quantified macro requirements
- Draw up a short list of candidate platforms/architectures
- Develop evaluation criteria, including quantified requirements regarding scale, performance, cost of operation, and other engineering factors (a simple scoring sketch follows this list)
- Assess the risks and determine what analysis, modeling, and testing are appropriate to sufficiently manage the risks
- Define the evaluation process, schedule and budget
- Solicit proposals
- Conduct evaluation including any testing or modeling required; determine differentiators, tradeoffs, pros, cons
- Develop platform recommendation, showing how it relates to the business issues and outcomes critical to the client
- If appropriate, make recommendations about database and solution architecture
- If desired by customer, assist in design and implementation until desired outcomes are achieved
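As one illustration of how quantified criteria can be combined once measured results are in hand, here is a minimal weighted-scoring sketch; the criteria, weights, and scores are hypothetical, and in practice the scores come from benchmark tests and performance models rather than vendor claims:

```python
# Hypothetical weighted-criteria comparison for the evaluation step above.
# Weights reflect the client's priorities; scores (1-10) would be derived
# from measured benchmark and modeling results.
weights = {"scalability": 0.30, "cost": 0.25, "availability": 0.20,
           "performance": 0.15, "elasticity": 0.10}

scores = {  # illustrative only
    "Platform A": {"scalability": 5, "cost": 3, "availability": 8,
                   "performance": 6, "elasticity": 7},
    "Platform B": {"scalability": 7, "cost": 5, "availability": 7,
                   "performance": 7, "elasticity": 6},
    "Platform C": {"scalability": 9, "cost": 9, "availability": 8,
                   "performance": 8, "elasticity": 7},
}

for platform, s in scores.items():
    total = sum(weights[c] * s[c] for c in weights)
    print(f"{platform}: weighted score {total:.2f} / 10")
```

The aggregation itself is only the last step; the substance of the evaluation lies in the measured tradeoffs behind each score.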
We have a variation on this process for customers who have already adopted a platform or architecture and are having, or anticipating, problems as the system is built out or phased into production.
How do you keep yourself informed about the latest technologies and developments in data architecture? How do you develop an in-depth understanding of the market?
We have a variety of activities that feed into our understanding of new developments and market needs.
First, we have our own self-funded R&D program in which we work hands-on with new products and new features; experiment with them; and measure them. Most of this work is done in the WinterCorp Cloud Database Lab, under the direction of Dr. Norbert Kremer.
Second, we maintain relationships with leading vendors and startups in our area of focus and spend time with their senior technical staff to understand their architectures, differentiators, performance, pricing, and product direction. We have private briefings with them under NDA as well as attending their conferences and special events for analysts and consultants.
Third, we learn through our engagements when we are exploring the frontiers of scalability and product performance on behalf of our customers, many of which are the leading companies in their industries.
Fourth, we conduct executive round tables with data leaders in larger enterprises (commercial and government) in which we learn about their challenges and needs.
Fifth, we teach masterclasses to data professionals and workshops to data leaders under the auspices of tdwi.org (Transforming Data with Intelligence). This brings us into contact with hundreds of data professionals and data leaders a year in interactive settings and provides us with opportunities to learn about their challenges and issues.
Sixth, in addition to my role as CEO of WinterCorp, I am also the Research Director of ACAN, the Analyst and Consultant Advisory Network (acadvisorynetwork.com), a forum for the presentation of new products and technology in data, AI, and analytics. This brings me into direct contact with thirty or forty vendors a year – established companies and startups – each of whom has a significant technology or new feature that they want to bring to the attention of independent analysts and consultants.
Seventh, we publish research, speak at conferences, and are active on LinkedIn, which results in many conversations with customers, vendors, other experts, and practitioners.
How do you motivate your team to stay focused on delivering excellence?
I select my team with great care and bring in only people who have a commitment to excellence and a career focus on our special interest in data platforms and architectures for the customer with enterprise-class requirements for scale and complexity.
And, our engagements typically involve very interesting and challenging problems – often first-of-a-kind – entailing demanding requirements in this area. As a result, the typical engagement is fascinating to me and our team, and there is little difficulty with motivation. It is always crystal clear how important it is to our client to have a solution that will work, so we are a highly motivated team.
How do you maintain your competitive advantage in the market?
Our competitive advantage results from our long-term, consistent focus on the most demanding requirements. Rather than aiming to serve the entire market, we are focused on that small percentage of organizations that is building a digital skyscraper. This is a very unusual focus for a consulting firm.
We have more experience quantitatively understanding the requirements, platforms, and solutions in this area of focus than any other consulting company I have come across. Other consultants typically understand one or two platforms – and they rarely, if ever, work at genuinely high levels of scale and complexity. We understand how to compare, analyze, and test multiple platforms and how to find the best fit for the customer. No one else seems to have the depth and quantitative focus on the evaluation process that we have.
I believe this is because most consulting companies are focused on growing revenues as fast as possible. If this is your goal, it makes more sense to partner with one or two platform vendors and sell your services into their aftermarket.
Our strategy is to be very selective and focused; deliver extremely high quality, highly differentiated services; deliver customer value and satisfaction; have the most interesting work; and, make a profit. We are interested in growth, but only as a consequence of the success of our customers. As long as we can continue to succeed in our mission, it is not important to us to create a large business. This allows us to prioritize quality over quantity.
What changes in the industry do you think businesses can look forward to in the next year?
I think the biggest changes will be driven by the widespread interest in – and rapid technical change with respect to – AI. Virtually every company is interested in the use of GenAI on enterprise data. That will create new demands on the data platform. Also, since many AI strategies involve classic machine learning in addition to GenAI, increased use of classic machine learning on enterprise data will drive new demands. In fact, I think many companies will need to reconsider their data platform direction in order to implement AI at the scale required for their business. We offer an assessment, based on our quantitative approach, to help companies determine if their current infrastructure is going to get them where they want to go. If the answer is no, we can collaborate with the customer on how to modify the platform strategy and direction.
“Often, we can recommend a supplemental platform to address higher-scale AI requirements.”
Typically, solving problems that involve mathematics requires classic machine learning rather than GenAI. Many business problems involve both the language, symbol (e.g., image), and text analysis and generation provided by GenAI and the data science delivered with classic machine learning. We are investing in analyzing and measuring both technologies as they are implemented in enterprise-class data platforms and architectures. Causal AI also appears to be emerging, and it will have a further impact on data platforms and architectures.
The emergence of open table formats such as Iceberg will open the door to the use of multiple query engines in one data ecosystem or lakehouse. I think many customers will soon be evaluating options along these lines.
For example, a customer may be building out a data lakehouse, fabric, or mesh on a popular product that is good for databases and workloads of moderate scale but very costly at a larger scale. This customer could benefit from introducing a second query engine for larger workloads or databases – thereby saving tens of millions of dollars in operating costs. We will be able to help these customers select the query engine that really delivers the savings or performance they are seeking.
What excites you about the future of the company? What are your short-term plans and long-term goals?
Today, I believe we are the leading experts in the architecture of the digital skyscraper – the data lakehouse, data warehouse, or analytic data ecosystem that is needed by the companies leading the way with new and better business strategies that leverage AI, advanced analytics, and data to compete at large scale. We aim to continue our success in the near term with more exciting engagements involving GenAI, classic machine learning, and other advanced analytics at a large scale.
My goal is to extend that lead to fully encompass the advances of the next five to ten years in all varieties of AI, hardware (e.g., GPUs, quantum computing?), and other advanced technologies in data and analytics. I’m expecting the use of causal AI on enterprise data to be one major trend here. These technological advances will be the ingredients of the data architectures that our leading customers will want to field, in ways that deliver value and have an impact on their business. We aim to stay in the lead as these advances – and others yet to emerge – unfold over the next decade.
A second major theme about to come in for more attention is the cost of operation. Many large customers are already concerned about the operating costs of their analytic data ecosystems, especially in the cloud.
“AI, if implemented without concern for scalability and efficiency, is going to drive those costs through the roof.”
We can help customers move forward into the age of AI with architectures that enable efficient, scalable implementation of AI on enterprise data. That is going to be a critical success factor in many larger businesses.
Would you like to give any advice to young and emerging entrepreneurs who are about to embark on their business journey?
Focus on differentiating your product or service in ways that deliver business value for your customers – and make sure that the value is appreciated by business stakeholders who have the clout to sponsor its implementation. Many startups fail by selling features that the customer does not perceive as critical to the business – or that no one with business responsibility sees as relevant to their most critical concerns.
The typical startup I talk with has some “sexy” technical concepts but little understanding of how the customer will apply them to deliver value.
Even today, twenty years after the emergence of classic machine learning in the enterprise, most machine learning projects fail or stall permanently between the pilot stage and production use in a business process. There is no question that machine learning is of immense value when applied to enterprise data; still, only about 30% of companies actually succeed in realizing that value.
I was the product leader of an early database company as our customer base expanded from three to one hundred customers. This was at a time when few customer executives even understood the idea of having a database shared by multiple users and applications – it was a complex idea to communicate, and even more challenging – as a small startup – to convincingly differentiate our platform from older, better-known platforms from bigger companies.
I can tell you that learning how to describe your product, its most significant advantages, and its business benefits – and then demonstrate them to an often skeptical audience (because they hear about many new products) – is one of the great challenges facing any advanced technology. Customers may value the apparent magic of the technology, but that only wins the sale if they also believe that they will realize business value when it is put into use. And they must believe that the delivered business value with your solution is greater than with the alternatives.
Richard Winter, CEO
Data management industry expert Richard Winter has unmatched data platform/solution know-how and experience. He has defined data requirements, developed data strategies and architectures, selected data platforms, and engineered data solutions for leading enterprises.