
7 reasons analytics and ML fail to meet business objectives

Monday, July 15, 2024, 11:00 AM, from InfoWorld
Foundry’s State of the CIO 2024 reports that 80% of CIOs are tasked with researching and evaluating possible AI additions to their tech stack, and 74% are working more closely with their business leaders on AI applications. Yet despite the demand to deliver business value from data, machine learning, and AI investments, only 54% of CIOs report IT budget increases, and AI investments ranked only third among the drivers of those increases, behind security improvements and the rising costs of technology.

CIOs, IT, and data science teams must be careful that the excitement around AI doesn’t drive irrational exuberance. One recent study shows that the most important success metrics for analytics projects include return on investment, revenue growth, and improved efficiencies, yet only 32% of respondents successfully deploy more than 60% of their machine learning models. The report also stated that over 50% do not regularly measure the performance of analytics projects, suggesting that even more analytics projects may fail to deliver business value.

Organizations shouldn’t expect high deployment rates at the model level, as it requires experimentation and iteration to translate business objectives into accurate models, useful dashboards, and productivity-improving AI-driven workflows. However, organizations that underperform in delivering business value from their portfolio of data science investments may reduce spending, seek alternative implementation methods, or fall behind their competitors.

While there are many technical and data-related reasons why analytics efforts fail or underperform, two recently published books suggest problems are more organizational and process-related. In Winning with Data Science, authors Friedman and Swaminathan suggest that business leaders must collaborate directly with data science teams and understand the machine learning development lifecycle. In The AI Playbook, author Siegel suggests that machine learning deployment is a “rare art” and that data science teams should start their efforts by establishing deployment and prediction goals.

I investigated the organizational and process issues that lead to underperformance. Here, I suggest what data science teams can improve while keeping in mind that deployment is not the end game. To drive ROI, growth, and efficiencies, data science teams must go beyond model deployment and ensure business teams use the analytics capabilities that are provided.

Why many analytics and machine learning efforts fail to deliver
We’ll look at the following reasons why analytics and machine learning efforts may not deliver business value as intended, and how teams can improve:

Analytics aren’t connected to end-user workflows
Inadequate collaboration between data scientists and developers
Insufficient attention to change management
Not learning from experiments
Delivering analytics without automation or integration
Proofs of concept without production results
The leadership and talent skills gap

Analytics aren’t connected to end-user workflows

A key problem I hear from data science leaders is that teams don’t understand well enough how their models and analytics connect to end-user workflows. It’s harder to gain end-user adoption when a predictive model isn’t integrated or automated in the system of record where people make decisions.

“When designing and developing AI solutions, start with the user experience and learn how a recommendation, prediction, or automation will drive business impacts,” says Soumendra Mohanty, chief strategy officer at Tredence. “In the analytics and data science space, it is important to interview the end users and learn what the problem is rather than throwing in front of them a bunch of dashboards disconnected from their business systems.”

Solution: Even when data science teams are only responsible for the models, the model development process should start with a defined vision statement covering the value to be delivered and how analytics solutions will fit into, or in some cases disrupt, the existing business process.

Inadequate collaboration between data scientists and developers

Achieving a production workflow that end-users adopt requires steps beyond model deployment: data scientists and software developers must collaborate on integrations, application modifications, and workflow changes.

“A common leadership issue in productizing AI or ML efforts is not having a proper framework or cadence of interaction between data scientists and software developers,” says Rita Priori, CTO at Thirona. “When models are mature enough to be productized, data science and software development teams must come together early to align on the next steps and ensure developers are aware of how this work falls into their domain.”

A key question is when to bring software teams into the loop to understand the model and identify the required system changes. “Bringing software teams in too early in the experimental phase is impractical, and taking ideas too far down the path without input from the receiving team can be inefficient,” says Priori. “Communicating clear expectations on when data scientists and developers should transition the model is critical for delivering value through productizing efforts.”

Solution: One approach is to create flexible, agile data science teams that bring on different skill sets during the analytics lifecycle. In the early planning period, these teams may include Six Sigma and UX specialists to review existing workflows and contemplate workflow changes. As models begin to show promising results, the team can swap in software developers to plan the implementation, including application changes and integrations.

Insufficient attention to change management

Change management is another responsibility of multidisciplinary data science and development teams. Expecting end-users to embrace and adopt machine learning-enabled workflow improvements without a change management process is a mistake, especially when machine learning models change decision-making capabilities or are delivered with workflow automations.

Lux Narayan, CEO and co-founder of StreamAlive, says a lack of alignment between tech and business teams and the absence of employee buy-in from the beginning of an initiative create a barrier to adoption. “Employees and teams must be fully aligned on why innovation efforts are important so that they can understand how to achieve their goals. Team leaders and organizational heads should ensure streamlined communication, provide the technology for synchronous and asynchronous communication, and ensure regular alignment between business teams, IT teams, and analytics leaders who are truly digging into the implementations.”

Solution: Best practices start by including stakeholders and selected end-users in drafting the vision statement, reviewing what predictions are important, sharing their impacts on workflow, and defining success criteria. During the data science and development process, invite key stakeholders and end-users to sprint planning sessions to contribute to requirements and to sprint reviews to review ongoing progress.

Not learning from experiments

While data scientists understand the experimental and iterative nature of their work, they may not recognize that embedding models in user experiences and improving workflows also requires iterative releases and feedback capture. Even when the collective team deploys a minimally viable experience, they must interview end-users and stakeholders and learn what changes and improvements are needed.

“AI implementations can either become the great accelerator of positive user experiences or the bothersome feature forced into an application and onto users,” says Cody De Arkland, senior director of product incubation at LaunchDarkly. “Product leaders have to ensure that AI functionality is implemented to align with their end users’ desired workflow as an ‘additive’ experience instead of a detractor. Employing experiments to ensure positive user sentiment helps you ship the AI capabilities quickly without risking losing customers because you forced a bad experience without data.”

Solution: Data science teams should consider a few techniques for solving this issue, including A/B testing to measure the user impact of different implementations and qualitative surveys of end-users. Developers and data scientists should also ensure that applications and workflows are observable so the team can capture and review performance issues, software defects, problems with the machine learning models, and usability concerns.
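
To make that concrete, here is a minimal sketch of such an experiment in TypeScript, assuming a Node 18+ runtime; the two model endpoints and the logAnalytics helper are hypothetical stand-ins for whatever serving and event infrastructure a team already runs. Users are deterministically bucketed so each one consistently sees the same variant, and every exposure is logged so outcomes can later be compared per variant.

```typescript
import { createHash } from "crypto";

type Variant = "control" | "new-model";

// Deterministically bucket a user so they always see the same variant.
function assignVariant(userId: string): Variant {
  const hash = createHash("sha256").update(userId).digest();
  return hash[0] % 2 === 0 ? "control" : "new-model";
}

// Placeholder for whatever event pipeline the team already uses.
function logAnalytics(event: { userId: string; variant: Variant; event: string }): void {
  console.log(JSON.stringify(event));
}

async function getRecommendations(userId: string): Promise<unknown> {
  const variant = assignVariant(userId);
  const endpoint =
    variant === "control"
      ? "https://ml.example.com/v1/recommend" // current production model
      : "https://ml.example.com/v2/recommend"; // candidate model under test
  const response = await fetch(`${endpoint}?user=${encodeURIComponent(userId)}`);

  // Record the exposure so downstream dashboards can compare
  // click-through, task completion, or survey scores per variant.
  logAnalytics({ userId, variant, event: "recommendation_shown" });
  return response.json();
}
```

Pairing these exposure events with outcome events such as clicks, task completions, or survey responses is what turns a simple traffic split into a measurable experiment.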

Delivering analytics without automation or integration

Delivering more data, predictions, dashboards, and reports into the hands of end-users may be valuable, but there is a potential cost in productivity. Data science teams may create more work for employees when the deployed reporting tools are disconnected from the platforms used for workflows and decision making. A second concern is when analytics deliver new insights, but significant manual work is needed to act on them. Automation and integrations should be priorities for the analytics delivery program.

“Analytics are meant to be consumed by humans, and they are the part that is not scalable,” says Vanja Josifovski, CEO and co-founder of Kumo. “Most key enterprise use cases, such as personalization and recommendations, are predictive. The road to unlocking the value here is to make machine learning easier and more automated, then expand to AI use cases.”

Solution: One approach to creating a simple integration between dashboards and applications is embedded analytics, where visuals from a data visualization or business intelligence tool are integrated into user interfaces with JavaScript or iframes. APIs offer richer integrations, as Ariel Katz, CEO of Sisense, noted in a recent blog post on a new era of embedded analytics: “Turning to APIs for embedded analytics would yield a much higher ROI by enabling developers to rapidly adapt to change and create new value for their users.”
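
As an illustration of the two integration styles, here is a hedged sketch; the bi.example.com host, its routes, and the response shape are assumptions, not any vendor’s actual API. The iframe route drops a finished visual into the page quickly, while the API route pulls the underlying numbers so the application can render them natively or feed them into automation.

```typescript
// Option 1: iframe embedding places a finished visual into the page,
// but the chart remains a black box to the host application.
function embedDashboard(container: HTMLElement, dashboardId: string): void {
  const frame = document.createElement("iframe");
  frame.src = `https://bi.example.com/embed/${dashboardId}`;
  frame.width = "100%";
  frame.height = "480";
  container.appendChild(frame);
}

// Option 2: an API integration pulls the underlying numbers, so the app
// can render them in its own UI, trigger alerts, or drive automation.
interface MetricPoint {
  label: string;
  value: number;
}

async function fetchMetric(metricId: string, apiToken: string): Promise<MetricPoint[]> {
  const response = await fetch(`https://bi.example.com/api/metrics/${metricId}`, {
    headers: { Authorization: `Bearer ${apiToken}` },
  });
  if (!response.ok) throw new Error(`BI API returned ${response.status}`);
  return response.json();
}
```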

Proofs of concept without production results

Product management, design thinking, and Six Sigma are important disciplines for leading agile teams that include business, data science, and software development members. However, even with collaboration, there remains a risk of running through too many proofs of concept (POCs) and iteratively improving them without pushing a workflow and its analytical capabilities into production.

When these issues persist over a significant period, it may be a sign that the POCs are misaligned with the business’s strategic direction, or that leadership hasn’t defined its strategic analytics priorities.

“A top reason AI efforts fail is a lack of strategic direction, and there’s also POC purgatory, with most models never making it out of the lab,” says Hillary Ashton, chief product officer of Teradata. “One way to more effectively productize analytics capabilities is with reusable data products or curated sets of trusted data. Using data you can trust is key to creating trust in AI.”

Ashton’s recommendation is key for data-driven organizations; the most successful ones realize that excelling at analytics requires an implementation strategy that builds on itself. Creating reusable and extendable data sets, machine learning models, and visualization components is not only efficient, it also helps data scientists deliver trustworthy, consistent analytics products and continue to improve them.
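
One lightweight way to make reusable data products concrete is to give each curated dataset an explicit contract that downstream teams can rely on. The sketch below is illustrative only, assuming a team tracks ownership, schema versioning, and freshness; the field names are hypothetical, not a specific platform’s schema.

```typescript
// An illustrative contract for a curated, reusable data product.
interface DataProduct {
  name: string;                 // e.g. "customer_churn_features"
  owner: string;                // accountable team or individual
  schemaVersion: string;        // consumers pin to a version
  refreshCadence: "hourly" | "daily" | "weekly";
  qualityChecks: string[];      // validations run before publishing
}

const churnFeatures: DataProduct = {
  name: "customer_churn_features",
  owner: "data-platform-team",
  schemaVersion: "2.1.0",
  refreshCadence: "daily",
  qualityChecks: ["no_null_customer_id", "row_count_within_expected_range"],
};
```

Treating each dataset this way lets new models and dashboards build on trusted, versioned inputs instead of re-deriving them for every POC.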

Solution: If a data science team is spinning the POC wheel without much to show for it, leadership should step in and provide guidance on priorities and promote workflow changes when models are production-ready. Paul Boynton, co-founder and COO of CsiBizInfo, says, “The true worth comes from leadership taking those insights and applying the findings to make concrete improvements within the organization.”

The leadership and talent skills gap

Organizations must grapple with whether they have the leadership talent and skills to keep up with the high-paced changes in AI capabilities, new technologies, and evolving business strategies. One issue is when organizations do not subscribe to a lifelong learning culture, and data scientists and technologists aren’t given enough opportunity to adopt modernized analytics approaches. Other times, talented teams are overwhelmed with supporting legacy technologies and can’t focus on new business, analytics, and technology opportunities.

“Analytics and AI efforts often fail to deliver value due to a lack of viable tech talent who can implement the right technology solutions at scale across an organization,” says Krishnan Venkata, chief client officer of LatentView Analytics. “Leaders can overcome this challenge by focusing hiring efforts on skilled data and analytics teams and technical talent, as well as outsourcing talent through alternative methods of hiring and offshoring.”

Technology and data science skill gaps can be addressed by hiring, training, and partnering. The other issue is whether the team has enough business acumen to relate analytics capabilities to business needs and opportunities. Greg Cucino of Trustwise says, “There can be a significant gap between the organization’s skilled personnel who understand the technology and those who know how to apply it to real-world business challenges.”

There’s a debate as to whether AI, especially generative AI, is a new leadership discipline or whether it is an extension of data science and technology responsibilities. Kjell Carlsson, head of data science strategy and evangelism at Domino, says, “Companies that are successfully implementing AI projects have created AI leadership roles, built out multidisciplinary AI/ML teams, built processes that span the AI lifecycle, and invested in integrated AI platforms that streamline the development, operationalization, and governance of AI projects.”

Solution: While there is hype around AI, many companies have proven results from their data science investments. Organizations must set themselves up for success. There are many data, technology, and governance considerations around AI and data science, but leaders should first look at people and process issues when analytics investments underperform. Organizations succeed by defining leadership roles, establishing priorities, driving multidisciplinary collaboration, and promoting learning activities.
https://www.infoworld.com/article/2515702/7-reasons-analytics-and-ml-fail-to-meet-business-objective...
