Why Data Science Alone Won’t Make Your Product Successful

The divide between tech and business teams has narrowed to almost nothing over the past decade, and I'm all for it. Not every tech team works at a tech company, and blurring the lines between business and tech means we can build and ship a product with confidence that it will be well received, widely adopted (which isn't always a given), and make a significant contribution to the bottom line. Tell me a better way to motivate a high-performing tech team, and I'll listen.

It's a shift that has been accelerated, if not caused, by data technology. For a long time, we've been working through the hype cycles of big data, business intelligence, and artificial intelligence. Each has introduced new skills, problems, and colleagues for the CTO and their team to handle, and each has moved us a little further from the rest of the organization; no one else can do what we do, but everyone needs it.

Technical teams are not inherently commercial, and as their roles have expanded to include building and delivering tools that support different teams across the organization, this gap has become increasingly apparent. We've all seen the statistics about how many data science projects, in particular, never get implemented, and it's no wonder why. Tools created for business teams by people who don't fully understand their needs, goals, or processes will always have limited utility.

This waste of technology dollars made sense in the early days of AI, when investors wanted to see technology investments rather than outcomes, but the technology has matured and the market has changed. Now we need to show real returns on our technology investments, which means delivering innovations that have a measurable impact on the bottom line.

Moving from support function to core function

The growing pains of the data hype cycle have brought two incredible advantages to today's CTO and their team (apart from the introduction of tools like machine learning (ML) and AI). The first is a mature, centralized data architecture that breaks down historical data silos across the business and gives us a clear picture, for the first time, of exactly what's happening commercially and how one team's actions affect another. The second is the shift from a support function to a core function.

The latter is vital. As a core function, technical staff now have a seat at the table alongside their business colleagues, and these relationships help drive a better understanding of processes beyond the technology team, including what those colleagues need to achieve and how it impacts the business.

This in turn has given rise to new ways of working. For the first time, technical people are not siloed away, fielding disconnected requests from across the business to pull these statistics or process that data. Instead, they can finally see the impact they are having on the business in monetary terms. It's a rewarding perspective that has given rise to a new way of working: an approach that maximizes that contribution and aims to generate as much value as possible, as quickly as possible.

Introducing lean-value

I hesitate to add another project management methodology to the lexicon, but lean-value is worth considering, especially in an environment where the return on technology investment is so closely scrutinized. The guiding principle is to "ruthlessly prioritize to maximize value." For my team, that means prioritizing the work that has the highest probability of delivering value or meeting the organization's goals. It also means deprioritizing noncritical work.

We focus on achieving a minimum viable product (MVP), applying lean engineering and architecture principles, and (here's the tricky part) actively avoiding a perfect build on the first pass. Every week, we review non-functional requirements and prioritize them against our goals. This approach reduces unnecessary code and prevents teams from drifting off topic or losing sight of the bigger picture. It's a way of working that we've also found to be inclusive for neurodivergent people on the team, because there's a very clear structure to latch onto.

The result was accelerated product deployment. We have a distributed, international team and operate a modular microservice architecture that lends itself well to a lean-value approach. Weekly reviews keep us focused and prevent unnecessary development—which in itself is a time saver—while also allowing us to make incremental changes, thereby avoiding extensive redesigns.

Using LLMs to enhance quality and speed up delivery

We set quality levels we want to achieve, but favoring efficiency over perfection, we are pragmatic about using tools like AI-generated code. GPT-4o can save us time and money by making recommendations about architecture and features. Our senior staff then spend their time critically evaluating and refining those recommendations rather than writing code from scratch.
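To make that concrete, here is a minimal sketch of what asking a model for an architecture recommendation might look like. It assumes the OpenAI Python SDK and an API key in the environment; the prompt and the service it describes are illustrative assumptions, not our actual setup, and the output is only ever a draft for a senior engineer to review.

```python
# Minimal sketch: asking GPT-4o for an architecture recommendation that a
# senior engineer then reviews. Assumes the OpenAI Python SDK (`pip install openai`)
# and OPENAI_API_KEY set in the environment; the prompt and the service it
# describes are hypothetical.
from openai import OpenAI

client = OpenAI()

prompt = (
    "We need a microservice that ingests CSV order exports, validates them, "
    "and publishes events to a message queue. Propose a lean architecture: "
    "components, data flow, and the non-functional requirements we should "
    "review weekly."
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a pragmatic software architect."},
        {"role": "user", "content": prompt},
    ],
)

# The output is treated as a draft only: it is attached to a ticket and must be
# approved by an experienced team member before any implementation work starts.
draft = response.choices[0].message.content
print(draft)
```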

There will be many who find this approach daunting or shortsighted, but we take care to mitigate risk. Each build increment must be production-ready, polished, and approved before we move on to the next. There is never a stage where humans are out of the loop: all code, especially generated code, is overseen and approved by experienced team members in accordance with our own ethical and technical codes of conduct.

The data lakehouse: a lean-value data architecture

Inevitably, the lean-value approach spread to other areas of our process, and the adoption of large language models (LLMs) as a time-saving tool led us to the data lakehouse, a portmanteau of data lake and data warehouse.

Standardizing data and structuring unstructured data to deliver an enterprise data warehouse (EDW) is a multi-year process, and it has its drawbacks. EDWs are rigid, expensive, and of limited use for unstructured data or diverse data formats.

A data lakehouse can store both structured and unstructured data, and using an LLM to process it reduces the time it takes to standardize and structure that data and automatically turns it into valuable insights. The lakehouse provides a single data management platform that can support both analytical and ML workflows, and it requires fewer resources from the team to set up and manage. Combining an LLM with a data lakehouse accelerates time to value, lowers costs, and maximizes return on investment (ROI).
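As a rough illustration of the pattern, the sketch below uses a model to pull structured fields out of a free-text record and appends the result to an open table format in the lakehouse. It assumes the OpenAI Python SDK and pyarrow; the schema, prompt, and paths are hypothetical examples rather than a production pipeline.

```python
# Minimal sketch of the LLM-plus-lakehouse idea: extract structured fields
# from free text with a model, then append the record to a Parquet dataset.
# Assumes the OpenAI Python SDK and pyarrow; field names, prompt, and paths
# are illustrative.
import json

from openai import OpenAI
import pyarrow as pa
import pyarrow.parquet as pq

client = OpenAI()

raw_note = "Customer Acme Ltd reported a billing issue on 2024-03-02, priority high."

extraction = client.chat.completions.create(
    model="gpt-4o",
    response_format={"type": "json_object"},
    messages=[
        {
            "role": "user",
            "content": (
                "Extract customer, issue_type, date and priority as JSON from: "
                + raw_note
            ),
        }
    ],
)

record = json.loads(extraction.choices[0].message.content)

# Append the structured record to a Parquet dataset in the lakehouse's raw zone.
table = pa.Table.from_pylist([record])
pq.write_to_dataset(table, root_path="lakehouse/support_issues")
```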

As with the lean-value approach to product development, this lean-value approach to data architecture requires certain safeguards. Teams must have robust, well-considered data governance in place to maintain quality, security, and compliance. Balancing query performance on large datasets with cost-effectiveness is also an ongoing challenge that requires continuous performance tuning.
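One common lever for that performance-versus-cost balance is partitioning lakehouse tables so queries only scan the files they need. The following is a minimal sketch with pyarrow; the column names and paths are hypothetical and stand in for whatever partitioning scheme fits your query patterns.

```python
# Sketch: partition a Parquet dataset so filtered queries scan less data.
# Assumes pyarrow; columns and paths are hypothetical.
import pyarrow as pa
import pyarrow.dataset as ds
import pyarrow.parquet as pq

table = pa.table({
    "event_date": ["2024-03-01", "2024-03-01", "2024-03-02"],
    "region": ["EU", "US", "EU"],
    "revenue": [120.0, 80.0, 95.0],
})

# Writing with partition columns lays files out as region=EU/..., region=US/...
pq.write_to_dataset(table, root_path="lakehouse/events", partition_cols=["region"])

# A filtered read then touches only the EU partition instead of the whole table.
eu_events = ds.dataset("lakehouse/events", partitioning="hive").to_table(
    filter=ds.field("region") == "EU"
)
print(eu_events.to_pandas())
```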

A Place at the Table

Lean-value is a framework that has the potential to transform how technology teams integrate AI insights into strategic planning. It enables us to deliver meaningful solutions for our organizations, motivates high-performing teams, and ensures they are used to their full potential. Critically for the CTO, it ensures that the ROI on technology is clear and measurable, creating a culture where technology sets business goals and contributes to revenue just as much as departments like sales or marketing.

Data decision makers

Welcome to the VentureBeat community!

DataDecisionMakers is a platform where experts, including technical data scientists, can share data-related insights and innovations.

If you want to learn about the latest ideas and insights, best practices, and the future of data and data technology, join us at DataDecisionMakers.

You might even consider contributing an article of your own!
