Data as Business Fuel - From Excel to AI

2025-12-21
10 min
By Pawel Lipowczan
Data is the new oil. This statement has been circulating in the tech industry since 2006, when mathematician Clive Humby first popularized the comparison. And just like crude oil, data has no value on its own -- only after proper "refining" does it become fuel that powers business.

From my 15 years of experience as a developer and 4 years working with no-code, I know one thing: most companies are sitting on a gold mine but can't extract it. They have tons of data but lack the tools and processes to turn it into valuable insights.

And now, in 2025, we're on the brink of another revolution. AI agents like Claude Code, GitHub Copilot, and Cursor are changing the rules of the game -- what used to require months-long IT projects can now be built in days. Combining no-code agility with the power of traditional code backed by AI opens up entirely new possibilities.

Today I'll show you how to make that journey -- from chaos in spreadsheets to an organized data system that actually works for the business.

The Problem: Drowning in Data, Dying of Thirst for Information

180 Zettabytes of Chaos

Imagine 180 zettabytes of data. That's the amount of information that will be generated in 2025.

To put this in perspective: if we printed that data on A4 sheets and stacked them, we could fly to the Moon and back 60 times. Or we'd need more than 11 trillion 16GB USB drives.

The amount of data doubles roughly every 3 years, and the pace is likely to accelerate with the growth of AI and IoT.

The paradox is this: we have plenty of data but can't extract information from it. It's like sitting in front of a pantry full of ingredients but without a chef or a cookbook -- we can't make a pizza.

Four Key Challenges

From my experience working with dozens of companies, I see the same problems repeating like a mantra:

1. Data Silos

Customer data in one system, orders in another, invoices in a third. No integration. Want to see the full customer picture? You need to cobble together 3-4 different reports and manually merge them in Excel.

2. Poor Quality and Inconsistency

  • Duplicates (the same customer entered 5 times)
  • Typos (Smith, Smyth, Smiith)
  • Different formats (2025-12-21 vs 12/21/2025 vs 21.12.25)
  • Wrong data (someone entered a birth date instead of an order date)

Any analysis based on such data leads you astray.
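The "different formats" problem in particular is very fixable in code. A minimal Python sketch that collapses the three date spellings above into one canonical form (the format list is an assumption -- extend it with whatever your data actually contains):

```python
from datetime import datetime

# Candidate formats seen in the examples above; order matters for ambiguous inputs.
KNOWN_FORMATS = ["%Y-%m-%d", "%m/%d/%Y", "%d.%m.%y"]

def normalize_date(raw: str) -> str:
    """Parse a date written in any known format and emit ISO 8601."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {raw!r}")

# All three spellings collapse to one canonical form:
print(normalize_date("2025-12-21"))  # 2025-12-21
print(normalize_date("12/21/2025"))  # 2025-12-21
print(normalize_date("21.12.25"))    # 2025-12-21
```

Once every date is in one format, comparisons, sorting, and grouping stop lying to you.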

3. Difficult Access

Data sits in SQL databases that regular employees can't access. Or they can access them but can't write a query. Every simple report requires a ticket to IT and a week of waiting.

4. Lack of Data Culture

Even when we give people tools, they don't know:

  • What questions to ask
  • How to interpret results
  • How to draw conclusions
  • How to implement those conclusions

Research from 2020 shows that despite having access to data, companies:

  • Lack a shared picture of the situation
  • Postpone decisions (because they're "not sure")
  • Waste precious time searching for information
  • Squander the economic and financial value of their data

The Path from Raw Material to Value

Before data can contribute anything valuable, it must go through several key stages:

  1. Acquisition - collecting data from various sources
  2. Integration - combining different sources into one coherent picture (this is the most important step!)
  3. Cleaning and transformation - without this, analyses are based on false premises
  4. Storage - in an analytical repository, accessible to the right people
  5. Analysis and visualization - dashboards, reports, charts
  6. Implementing insights - actually using information in decision-making

This is where business value finally appears.

Most companies get stuck at steps 2-3. They have data but can't merge and clean it well enough to serve as a basis for decisions.
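To make stages 1-5 concrete, here is a deliberately tiny end-to-end pipeline in Python with an in-memory SQLite store. Table names, columns, and the sample rows are invented for illustration -- a sketch of the shape, not a production ETL:

```python
import sqlite3

# 1. Acquisition: rows as they might arrive from two separate systems.
crm_rows  = [("Acme", "acme@example.com"), ("acme ", "acme@example.com")]
shop_rows = [("acme@example.com", 1200), ("acme@example.com", 800)]

# 2-3. Integration + cleaning: trim whitespace, lowercase keys, drop duplicates.
customers = {}
for name, email in crm_rows:
    customers.setdefault(email.lower(), name.strip())

# 4. Storage: load the cleaned picture into an analytical store.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (email TEXT PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE orders (email TEXT, amount INTEGER)")
db.executemany("INSERT INTO customers VALUES (?, ?)", customers.items())
db.executemany("INSERT INTO orders VALUES (?, ?)", shop_rows)

# 5. Analysis: one coherent view instead of two disconnected spreadsheets.
row = db.execute(
    "SELECT c.name, SUM(o.amount) FROM customers c "
    "JOIN orders o ON o.email = c.email GROUP BY c.email"
).fetchone()
print(row)  # ('Acme', 2000)
```

Stage 6 -- actually acting on that number -- is the part no pipeline can automate for you.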

Solution Part 1: No-Code and Data Democratization

Over the past 4 years, I've watched no-code transform how companies work with data.

Platforms like Airtable, Make, n8n, and Zapier radically lower the barrier to entry. People from business, marketing, and finance departments can independently:

  • Create data structures
  • Integrate different sources
  • Build simple reports and dashboards
  • Automate information flow

They don't have to wait weeks for the IT department. They don't need to know SQL, Python, or JavaScript.

Benefits of Data Democratization

Speed: A prototype system in Airtable can be built in hours, not months.

Cost: No need to engage expensive developers for simple reports.

Business proximity: The people who know the data best (because they use it) can build solutions themselves.

Engagement: Employees feel empowered -- they can solve their own problems.

But Democratization Requires Balance

Data democratization isn't just about technology. It requires balancing two values that pull in opposite directions:

Data literacy - user skills:

  • Effective use of data
  • Asking the right questions
  • Drawing sensible conclusions
  • Understanding limitations and errors

Data governance - data management:

  • Not everyone should have access to everything
  • There must be procedures and security frameworks
  • Quality and consistency must be maintained
  • Someone needs to make sure data doesn't get "broken"

You need to find a happy medium -- governance can't act as a brake; it must be agile enough to support users rather than block them.

Solution Part 2: AI as a Game-Changer

And this brings us to where we are now -- late 2025.

AI is changing absolutely everything when it comes to working with data.

What AI Already Delivers Today

1. Talking to Data in Natural Language

You don't need to know SQL. You type: "Show me the top 10 customers by order value this quarter" -- and you get an answer. This radically lowers the data literacy barrier.
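Under the hood, a question like that gets translated into an ordinary query. The snippet below shows roughly what an LLM would generate for it -- the schema, sample rows, and hard-coded quarter dates are all invented for the example:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, order_date TEXT, value REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    ("Acme",    "2025-11-03", 5000.0),
    ("Globex",  "2025-10-17", 7500.0),
    ("Initech", "2025-12-01", 1200.0),
    ("Acme",    "2025-12-10", 3000.0),
])

# "Show me the top 10 customers by order value this quarter" becomes:
sql = """
    SELECT customer, SUM(value) AS total
    FROM orders
    WHERE order_date BETWEEN '2025-10-01' AND '2025-12-31'
    GROUP BY customer
    ORDER BY total DESC
    LIMIT 10
"""
for customer, total in db.execute(sql):
    print(customer, total)
```

The user never sees the SQL -- but it's worth knowing it exists, because when the answer looks wrong, the generated query is the first place to check.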

2. Data Cleaning and Normalization

AI can find duplicates even with typos. It can standardize formats. It can fill in gaps based on context.
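You don't always need a large model for the typo case -- string similarity from Python's standard library already catches a lot. A sketch using the Smith/Smyth/Smiith example (the 0.8 threshold is an assumption to tune per dataset):

```python
from difflib import SequenceMatcher
from itertools import combinations

names = ["Smith", "Smyth", "Smiith", "Jones"]

def likely_duplicates(values, threshold=0.8):
    """Return pairs of values whose similarity ratio meets the threshold."""
    pairs = []
    for a, b in combinations(values, 2):
        if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold:
            pairs.append((a, b))
    return pairs

print(likely_duplicates(names))
# [('Smith', 'Smyth'), ('Smith', 'Smiith')]
```

Note that Smyth and Smiith don't match each other directly, but both match Smith -- so in practice you cluster around a canonical record rather than relying on pairwise checks alone. Where simple similarity gives up, an LLM can judge from context (same address, same phone number) that two records are the same customer.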

3. Pattern Discovery

In large datasets, AI will find correlations that humans would miss. For example: "customers who buy product X on Friday are more likely to return for product Y on Monday."

4. Generating Analyses and Forecasts

Based on historical data, AI can predict future trends, demand, and customer churn risk.
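The simplest version of such a forecast is just a trend line fitted to history. A pure-Python least-squares sketch (the monthly sales figures are made up):

```python
# Monthly sales history (made-up numbers); forecast the next month
# with an ordinary least-squares trend line.
history = [100, 110, 125, 130, 145, 150]

n = len(history)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(history) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

forecast = slope * n + intercept  # extrapolate one step past the data
print(round(forecast, 1))  # 162.7
```

Real demand or churn models are far richer than a straight line, of course -- but the principle is the same: fit the past, extrapolate carefully, and never trust the extrapolation further than the data supports.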

5. Report Automation

Instead of manually creating reports every week, AI can automatically generate summaries, charts, and insights.

But Also Risks

We need to be aware of the limitations:

  • Hallucinations - AI can invent facts that sound convincing
  • Black box - we don't always know where AI got its information
  • Non-determinism - it may give a different answer each time
  • Garbage in, garbage out - junk data will produce junk insights
  • Privacy - data can leak to models trained by third-party companies
  • Bias - even well-organized data can lead to wrong conclusions due to model biases

That's why AI is a tool, not a replacement for thinking.

The Breakthrough: AI Agents for Developers

And now the most important part: a new generation of AI tools for developers is changing the entire game.

For years we had a choice:

  • No-code: fast, cheap, but limited capabilities
  • Traditional code: unlimited capabilities, but slow and expensive

AI Agents Bridge These Two Worlds

Tools like:

  • Claude Code (which I'm using to write this article)
  • GitHub Copilot
  • Cursor
  • Windsurf

...give developers no-code agility + the power of traditional code.

What was previously only possible through no-code (rapid prototyping, iterative solution building), developers with AI assistance can now do just as quickly -- but with the full flexibility of code.

A Real-World Example

I recently built a system for analyzing data from multiple sources (Airtable, external APIs, CSV files). Previously:

  • In no-code: fast, but I couldn't handle complex transformations
  • In traditional code: 2-3 weeks of work

With Claude Code: 3 days.

AI helped me:

  • Quickly write API integration scripts
  • Generate data cleaning code
  • Build an ETL pipeline
  • Write tests
  • Optimize queries

I didn't need to memorize every library's syntax. I didn't need to search for solutions on Stack Overflow. I simply described what I wanted to achieve -- and AI generated code I could review, understand, and customize.

What This Means for Data Management

1. Faster Solution Deployment

Projects that used to take months now take weeks. This means faster ROI and the ability to experiment.

2. Lower Barriers to Entry - But With a Caveat

AI lowers the entry threshold, but it doesn't eliminate the need for experience. A junior developer with AI can do more, but an AI agent on its own is like a junior -- it can make mistakes, miss business context, and generate code with security gaps.

You can get started without deep experience, but someone experienced should oversee what AI generates -- verifying approaches, catching errors, assessing quality. AI is a powerful tool, but it requires supervision.

3. Better Code Quality

AI helps with writing tests, documentation, and optimization. Code is cleaner and easier to maintain.

4. Code vs No-Code - Depends on Your Team

The choice depends on the resources in your organization:

Have a senior developer? -> Go with code + AI. You'll get flexibility, no limitations, and full control.

No technical team? -> Stick with no-code (Airtable, Make, n8n). It's a far more accessible solution for people without programming experience.

A no-code + code hybrid only makes sense if you have someone technically strong on board. Then it's the best option: you keep the agility of prototyping and the flexibility of code, without the limitations that sometimes appear in no-code tools.

5. Talking to Data in Natural Language - But With the Power of SQL

You don't have to choose between simplicity and flexibility. AI can translate your question into a complex SQL query or Python script.

Case Study: From Chaos to Clarity

Theory is one thing, practice is another. Let me show you how we went through this transformation in our own company.

Context

22Ventures was a holding company made up of several firms: Automation House, Tigers, Huciao, Sowicki Legal. Automation House was created specifically to organize processes and data internally -- and then broadly bring those solutions to market.

We started where most companies do: chaos in spreadsheets.

Problem 1: Data Inconsistency

Every team had their own spreadsheets. The same data was entered differently:

  • Customer duplicates (the same customer 5 times in the database)
  • Typos (Smith, Smyth, Smiith)
  • Different date, amount, and name formats
  • Transcription errors

Solution: Normalization in Airtable

  • Creating dictionaries (lists of unique values)
  • Removing duplicates
  • Standardizing formats
  • Field-level validation (only specific values, formats)
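The dictionary-plus-validation idea isn't Airtable-specific -- the same pattern takes a few lines of plain code. A minimal sketch (the field names, allowed statuses, and date rule are invented for illustration):

```python
import re

# Dictionary: the only values the status field may take.
ALLOWED_STATUSES = {"lead", "active", "churned"}
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    if record.get("status") not in ALLOWED_STATUSES:
        errors.append(f"unknown status: {record.get('status')!r}")
    if not DATE_RE.match(record.get("signed_up", "")):
        errors.append(f"bad date format: {record.get('signed_up')!r}")
    return errors

print(validate_record({"status": "active", "signed_up": "2025-12-21"}))  # []
print(validate_record({"status": "Aktive", "signed_up": "21.12.25"}))
```

Run this at the point of entry, not after the fact -- rejecting a bad record is cheap; cleaning it up six months later is not.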

Problem 2: Multiple Data Sources

Each department had its own spreadsheet:

  • HR - employee list
  • Finance - invoices and payments
  • Projects - tasks and timesheets
  • Sales - leads and customers

Data was duplicated across departments. No synchronization procedures. No organizational culture around data.

Solution: Migration to Airtable

Smooth migration from spreadsheets to Airtable. Consolidation into one workspace with proper bases and relations.

Example transformation: Team leader

In a spreadsheet: Every project had a full row of team leader data: first name, last name, email, phone, rate. Changing team leader data = manual changes in 20 places.

In Airtable: An employees base + a projects base connected by a relation. The team leader is a linked record. Change the data in one place -- it works everywhere automatically.
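The linked-record idea is easy to see in plain data structures too (the names and IDs below are illustrative): employee details live in exactly one table, and projects hold only a reference.

```python
# Single source of truth: employee details exist in exactly one place.
employees = {
    "emp_01": {"name": "Anna Kowalska", "email": "anna@example.com"},
}

# Projects store a reference, never a copy of the leader's data.
projects = [
    {"name": "Website redesign", "leader_id": "emp_01"},
    {"name": "CRM migration",    "leader_id": "emp_01"},
]

# Change the email once...
employees["emp_01"]["email"] = "a.kowalska@example.com"

# ...and every project sees the update automatically.
for p in projects:
    print(p["name"], "->", employees[p["leader_id"]]["email"])
```

That lookup through `leader_id` is exactly what Airtable's linked record does for you behind the scenes -- and what a spreadsheet with 20 copied rows can't.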

Problem 3: No Dictionaries

Data was entered freehand. Everyone typed it their own way.

Example: Employee benefits

In a spreadsheet: Every employee had a "Benefits" column with a full description and price. Changing the gym membership price = manual change for 50 employees.

In Airtable: A benefits base (dictionary) with prices. Employees have linked records to benefits. Change the price in one place -- it updates automatically for everyone.

Problem 4: Redundancy and Duplication

Spreadsheets naturally lead to data duplication. Every change requires manual propagation.

Solution: Single source of truth

In Airtable, every record exists in one place. Relations connect data without duplication. Change in one place = change everywhere.

Transformation Results

Before:

  • 15 different spreadsheets
  • 3-4 hours per week preparing reports
  • Data errors in every analysis
  • Team frustration
  • Decisions based on "gut feeling"

After:

  • One system in Airtable
  • Reports generated automatically
  • Clean and consistent data
  • Happy team (they have tools that work)
  • Data-driven decisions

And most importantly: This system evolves. We started at level 1-2, and now we're at 3-4 out of my 5 data maturity levels.

5 Levels of Data Maturity

Based on working with dozens of companies, I've developed a model for assessing where an organization stands in data management:

Level 1: Ad Hoc

  • Spreadsheets scattered across the company
  • First attempts at analysis (basic SUM, AVERAGE formulas)
  • Everyone does it their own way
  • No standards

Level 2: Consolidation

  • Combining data sources into one place
  • E.g., migrating from spreadsheets to Airtable
  • Everything in one system
  • Basic relationships between data

Level 3: Standardization

  • Data collection procedures
  • Processing standards
  • Data governance (who has access to what)
  • Dictionaries and validation
  • Data cleaning processes

Level 4: Optimization

  • Extracting financial value from data
  • Creating data-based products
  • Monetization (selling data, insights)
  • Advanced analyses and forecasts

Level 5: Innovation

  • Fully data-driven culture
  • Every decision based on facts
  • Standardized processes across the company
  • Regular monetization
  • Continuous improvement of data quality and flow
  • Experimentation and hypothesis testing

Most companies are at level 1-2. Very few reach 4-5.

But with no-code and AI, this process can be accelerated many times over.

What You Can Do Today

Step 1: Assess Your Maturity

Ask yourself:

  • How many systems/spreadsheets do we store data in?
  • How long does it take to prepare a basic report?
  • How often do we find errors in our data?
  • Do we have procedures for data entry?
  • Do people trust our data?

This will help you determine which level you're at.

Step 2: Start with One Area

Don't try to fix everything at once. Pick one problem:

  • Maybe it's a customer list with duplicates
  • Maybe it's project chaos
  • Maybe it's the sales reporting process

Start with a small pilot project.

Step 3: Choose the Right Tool

For simple cases: Airtable, Notion, Google Sheets with Apps Script

For automation: Make, n8n, Zapier

For advanced analytics: Python + Pandas (with AI assistance), Power BI, Tableau

For no-code + code integration: Airtable + custom scripts built with Claude Code or Cursor

Step 4: Build a Prototype with AI

You don't need to be an expert. Use Claude, ChatGPT, or another LLM:

  • Describe your problem
  • Ask for a data structure proposal
  • Ask for integration/cleaning code
  • Iterate and refine

With AI, you can build a working prototype in hours, not weeks.

Step 5: Test, Learn, Iterate

Don't seek perfection right away. Deploy something simple, see how it works, gather feedback, improve.

Agility is key.

Key Takeaways

1. Data Without Refining Is Just Junk

It's not enough to have lots of data. You need to merge, clean, and standardize it -- only then does it have value.

2. No-Code Democratizes Access

Airtable, Make, n8n -- these tools give business people the power to build solutions without waiting for IT.

3. AI Changes the Rules of the Game

Talking to data in natural language, automatic cleaning, pattern discovery -- AI lowers the barrier to advanced analytics.

4. AI Agents Bridge No-Code and Traditional Code

Claude Code, GitHub Copilot, Cursor -- now developers can build as agilely as with no-code, but with the full power of code. It's the best of both worlds.

5. Democratization Requires Balance

Data literacy (skills) + data governance (management) must go hand in hand. Easy access without control is chaos. Control without access is a brake.

6. Start with a Small Step

Don't wait for a grand transformation project. Pick one problem, build a prototype, test, learn. Small wins build momentum.

7. Quality > Quantity

It's better to have 10 well-organized, clean data sources than 100 chaotic spreadsheets.

Next Steps

The world of data is changing at lightning speed. What was science fiction a year ago (AI generating code, talking to data in natural language) is now an everyday reality.

Companies that master the art of turning data into business value will be leaders in their industries.

And technology is no longer the barrier. No-code lowered the entry threshold. AI provided analytical power. AI agents gave developers agility.

Now the only barrier is the decision: start or wait.

From my experience, I know that companies that started a year ago have a competitive advantage today. Those that start today will have one a year from now.

And those that keep waiting... will drown in a sea of data, dying of thirst for information.

Need help organizing your company's data?

I can help you go from spreadsheet chaos to a coherent data management system. From auditing and consolidating sources through migration to Airtable/databases to automating reporting.

Book a free consultation

FAQ

What does "data is the new oil" actually mean in business practice?

Data, like oil, has no value on its own -- it must go through "refining" (cleaning, integration, analysis) to become fuel for business decisions. Most companies are sitting on a gold mine but can't extract it -- they have tons of data but lack the processes and tools to transform it into useful information.

What are the main obstacles preventing companies from leveraging their data?

Four key challenges: data silos (information scattered across different systems with no integration), poor quality (duplicates, typos, inconsistent formats), difficult access (needing IT for every report), and lack of data culture (people don't know what questions to ask or how to interpret results).

When should you choose no-code vs AI-assisted code for data management?

Have a senior developer on your team -- choose code with AI (Claude Code, Cursor) for full flexibility. No technical team -- stick with no-code (Airtable, Make, n8n). A hybrid approach only makes sense with strong technical support -- then you combine prototyping agility with practically no limitations.

How do you assess your company's data maturity level?

Five levels: (1) Ad hoc - scattered spreadsheets with no standards, (2) Consolidation - data in one system, (3) Standardization - procedures, governance, validation, (4) Optimization - monetization and advanced analytics, (5) Innovation - a fully data-driven culture. Most companies are stuck at level 1-2.

Where should you start with a data transformation in your company?

Pick one specific problem (e.g., customer duplicates, project chaos), build a small prototype in Airtable or with AI, test it with your team, gather feedback, and iterate. Don't seek perfection right away -- small wins build momentum. Trying to transform the entire company at once is a recipe for failure.

Tags: AI, Automation, No-Code, Airtable, Claude