Companies struggle with the slow, unpredictable nature of AI projects
Despite significant investment in AI, many companies are still struggling to stabilize and scale their AI initiatives, according to Dotscience.
While 63.2% of businesses reported spending between $500,000 and $10 million on their AI efforts, 60.6% of respondents continue to experience a variety of operational challenges.
This is evident in the finding that 64.4% of organizations deploying AI said it takes seven to 18 months to get their AI workloads from idea into production, illustrating the slow, unpredictable nature of AI projects today.
The State of Development and Operations of AI Applications 2019 report findings are based on a survey of 500 industry professionals.
The research examines the AI maturity of businesses based on the practical business applications of machine learning, the tools and processes being used to develop, deploy and monitor machine learning models, and the scalability and stability of their AI initiatives.
What is driving the use of AI today?
AI has moved beyond the experimentation stage and is now seen as a critical and impactful function for many businesses. Dotscience’s research revealed that:
- Efficiency gains (47%), growth initiatives (45.6%) and digital transformation (43.8%) are the top three drivers for AI adoption.
- Over 88% of respondents at organizations where AI is in production indicated that AI has either been impactful or highly impactful to their company’s competitive advantage.
- Nearly a third of respondents (30.2%) are budgeting between $1 million and $10 million for AI tools, platforms and services.
The study also found that despite this level of financial commitment, data science and ML teams continue to experience issues, including duplicating their work (33%), rewriting models after team members leave (27.6%), justifying the value of their projects to the wider business (27%), and slow and unpredictable AI projects (24.6%).
“With the amount of resources and money that organizations are spending on their AI initiatives, they cannot afford to make sacrifices when it comes to the productivity and efficiency of the teams responsible for realizing their AI ambitions,” said Luke Marsden, founder and CEO at Dotscience.
“It is difficult to be productive when different team members cannot reproduce each other’s work. Reproducibility is key to enabling efficient collaboration and auditability. Many companies still rely on manual processes which discourage collaboration and make it difficult to scale and accelerate ML teams.”
Call to action: Data science and ML teams – do away with manual tracking!
Before the DevOps movement, practices such as version control and continuous integration were not commonplace, and it was typical for software to take months to ship.
Today, companies can ship software changes in minutes. History is now repeating itself in the AI and ML sphere, with teams experiencing productivity and collaboration challenges analogous to those of the pre-DevOps software development era.
AI deployments today are slow and inefficient. Moreover, the manual tools and processes predominantly in use to operationalize ML and AI do not support the scaling and governance demanded of many AI initiatives. Results from the study indicate that:
- The top two ways that ML engineers or data scientists collaborate with each other are using a manually updated shared spreadsheet for metrics (44.4%) and sitting in the same office and working closely together (38%).
- Nearly 90% of respondents either manually track model provenance (a complete record of all the steps taken to create an AI model) or do not track provenance at all.
- Of those that manually track model provenance, more than half (52.4%) do their tracking in a spreadsheet or wiki.
“Manual processes are cumbersome, discourage collaboration and create knowledge silos within teams,” explained Marsden. “When model provenance is tracked manually, AI and ML teams often use spreadsheets without an effective way to record how their models were created. This is inflexible, risky, slow and complicated. To simplify, accelerate and control every stage of the AI model lifecycle, the same DevOps-like principles of collaboration, fast feedback and continuous delivery should be applied to AI.”
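To make the contrast with spreadsheet-based tracking concrete, the sketch below shows what automated provenance tracking can look like in practice. It uses the open-source MLflow tracking API purely as an illustration (the report does not prescribe a specific tool), and the run name, tags, parameters and metric values are hypothetical placeholders.

```python
# Minimal sketch of automated provenance tracking, using the open-source
# MLflow tracking API for illustration only; the tag, parameter and metric
# values below are hypothetical placeholders.
import mlflow

with mlflow.start_run(run_name="example-run"):
    # Record the inputs that produced the model: code revision, data version
    # and hyperparameters, so any team member can reproduce the run.
    mlflow.set_tag("git_commit", "abc1234")
    mlflow.set_tag("dataset_version", "2019-09-01")
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_param("n_estimators", 200)

    # ... model training would happen here ...

    # Record the outputs: evaluation metrics tied to the same run, giving an
    # auditable record instead of a manually updated spreadsheet entry.
    mlflow.log_metric("accuracy", 0.91)
```

Each run is then stored with its full lineage of code, data, parameters and results, which is the kind of fast feedback and auditability the DevOps analogy points to.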
According to Gartner’s 2019 CIO Survey, the number of enterprises implementing AI grew 270% in the past four years and tripled in the past year.
While AI is increasingly in use throughout the modern enterprise, many organizations will be unable to realize the full potential of their deployments until they find faster and more efficient means of tracking data, code, models and metrics across the entire AI lifecycle.