There has never been an easier time to find a software solution for a problem or an opportunity.
There are countless resources available to help us size up, compare and contrast any kind of solution.
Eager technologists faced with legacy challenges are quick to procure tools that sometimes end up as project-based point solutions. The proliferation of these software solutions presents a challenge in its own right: the more solutions there are, the more complexity there is to manage them, and the more moving pieces there are to orchestrate in order to limit the impact on user experience.
We (naturally) procure solutions on the basis of performative actions, the immediate tasks a tool performs, with little thought to how they contribute to the overall technical and data landscape. That, however, is only half the job.
We also lean toward known products that have good reviews and have been around for a while. The challenge with this ingrained approach is that we often compound the problem. Many of the "popular" solutions are older and use outdated technology. They weren't built for interoperability or to exist in a larger ecosystem. They were built to stand alone, leaving organizations heavily invested in incomplete or ineffective solutions.
A company may have spent a lot of money on its infrastructure yet still face the same nagging challenges, buying more "good enough" products and services from the existing vendor rather than newer interoperable products that lead their respective domains.
With so many legacy dinosaurs occupying significant amounts of the software market, it's no surprise that disruption is happening in all industries.
However, before we throw out our aging stack (and "rip and replace" is more painful than most of us have the stomach for), there is a much easier path that starts with first principles of data: How can the data we currently have be connected to other business data and made available for use in other parts of the organization? And if we are procuring a new solution (regardless of its performative action), how can it further these goals, both with its own data and in how it leverages the complementary data already available?
The adoption of flexible, connected data models is booming—and for good reason.
Interoperability, connectivity, data management and machine learning are driving a new generation of software solutions. While some industries have understood the risk of being late and the incredible opportunity of being early, others can't see beyond the surface of performative actions and traditional approaches.
As buyers, we have to look beyond the face value of solution functionality to understand how the product is built, how data is managed and how it is designed to integrate with and leverage adjacent solutions. This forces us to move beyond project-based thinking and consider the strategic question of how each piece of software contributes to the full technical capability of the organization. Thinking this way creates efficiencies and leads to more effective software purchasing decisions.
A first step in this direction is to get an overview of the current stack and find the gaps. Discover which data is only being used for performative actions, which data is duplicated and which data could offer greater value (e.g., in support of analytics). Some simple data management techniques, such as data cataloging and mapping, can support this process. With this picture, organizations can take action to achieve efficiencies, generate value and make better investments in the future.
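To make the cataloging step concrete, here is a minimal sketch of the idea: inventory the fields each system holds, then flag fields that appear in more than one system as candidates for duplication or consolidation. The system names and fields below are hypothetical examples, not taken from any real stack.

```python
# Minimal data-catalog sketch: map each field to the systems that hold it,
# then surface fields held in more than one system (duplication candidates).
# The "crm"/"billing"/"support" datasets are illustrative assumptions.
from collections import defaultdict

def build_catalog(datasets):
    """Return a mapping of field name -> set of systems containing it."""
    catalog = defaultdict(set)
    for system, fields in datasets.items():
        for field in fields:
            catalog[field.lower()].add(system)
    return catalog

def find_duplicates(catalog):
    """Fields appearing in more than one system are consolidation candidates."""
    return {field: sorted(systems)
            for field, systems in catalog.items()
            if len(systems) > 1}

datasets = {
    "crm": ["customer_id", "email", "phone"],
    "billing": ["customer_id", "email", "invoice_total"],
    "support": ["ticket_id", "customer_id"],
}

duplicates = find_duplicates(build_catalog(datasets))
print(duplicates)
# customer_id is held in all three systems; email in two
```

Even this toy inventory makes the gaps and overlaps visible; real tooling adds schema extraction, lineage and ownership on top of the same basic mapping.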
From the vendor side, the new generation of tooling cannot disregard the context that buyers are currently working in. They need solutions that can not only connect to their existing stack but can leverage the data held by those legacy systems. New software investments should not just provide future value but extend the value of the previous investment. This is where innovation can bloom—enhance what you have while introducing new functionality and tooling that positions for the future.
There are many exciting industries ripe for innovation, and identity and access management (IAM) in particular. This is an industry that has used the same underlying technology for decades despite the evolving needs of workforce and customer interactions, and it stands to benefit significantly from fresh tooling. And fresh tooling is coming: Gartner Inc. predicted (paywall) that within the next three years, organizations that adopt data management practices in support of their IAM programs will realize a 40% improvement in time to value delivery.
At the same time, we are also seeing a massive consolidation of the IAM industry (including my former company), which will likely result in the above-mentioned legacy challenges compounding as market leaders are stripped and squeezed.
This is a perfect environment for new players to disrupt with new data-driven tooling, but it might create challenging conditions for customers.
Dinosaurs will become extinct in one way or another. You don't want to be their customer when it happens.