The Challenge
Physical assets globally generate 41 gigatons of CO2e emissions and 84 zettabytes of data annually. Gartner estimates that 68% of this asset data remains unused, resulting in three key challenges:
Information loss every time there is an asset transition, whether financial or operational
Information asymmetry between various stakeholders interacting with the asset
Opportunity cost of not optimizing an asset's value over its lifecycle
Over 10,000 companies compete directly or indirectly in this space, yet the problem remains unsolved because:
The marginal cost and time to acquire and mine an asset's dark data are high
They optimize operational, financial, or emissions objectives individually, not in aggregate and simultaneously (see the sketch after this list)
Network effects are not considered, and recommendations are moment-in-time rather than lifecycle-based
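To make "in aggregate and simultaneously" concrete, here is a minimal sketch in Python of scoring candidate asset actions against financial, operational, and emissions objectives at once. The weights, normalization constants, and candidate actions are hypothetical illustrations, not a description of any existing product.

    # Minimal sketch: score candidate asset actions against financial,
    # operational, and emissions objectives simultaneously instead of one at a time.
    # All names, weights, and numbers are hypothetical illustrations.
    from dataclasses import dataclass

    @dataclass
    class ActionOutcome:
        name: str
        financial_return: float   # e.g. net value of the action, in $
        uptime_gain: float        # e.g. additional operating hours per year
        co2e_avoided: float       # e.g. tonnes of CO2e avoided per year

    def aggregate_score(o: ActionOutcome,
                        w_fin: float = 0.5,
                        w_ops: float = 0.3,
                        w_em: float = 0.2) -> float:
        """Weighted sum of roughly normalized objectives; one number per action."""
        # The divisors are placeholder constants that bring the units onto a common scale.
        return (w_fin * o.financial_return / 100_000
                + w_ops * o.uptime_gain / 1_000
                + w_em * o.co2e_avoided / 100)

    candidates = [
        ActionOutcome("defer overhaul", 80_000, -200, 5),
        ActionOutcome("overhaul now", 40_000, 900, 60),
        ActionOutcome("replace asset", -20_000, 1_500, 140),
    ]

    # Optimizing any single column in isolation picks a different "best" action;
    # the aggregate score makes the trade-off explicit.
    best = max(candidates, key=aggregate_score)
    print(best.name, round(aggregate_score(best), 3))

In this toy example the purely financial view would defer the overhaul, while the aggregate view favors replacing the asset; the point is only that a single, simultaneous objective surfaces trade-offs that per-silo optimization hides.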
The Opportunity
We estimate that US companies spend upwards of $35 billion annually to create visibility into their assets' dark data, yet without clear ROI, as a result of the following factors:
Fragmented Ecosystem: The average enterprise deploys 130+ applications, each targeting a specific aspect of an asset's operational or financial performance.
With Connected Decisions: Users need to make decisions about an asset using data that resides in multiple applications; the data are interdependent but not yet connected.
But Siloed Solutions: SaaS solutions are moment-in-time, most require structured data, they don't span the asset lifecycle, and they optimize only one of the financial, operational, or emissions parameters.
Creating Linkage Burden: Enterprises build large data teams or hire systems integrators to link, govern, centralize, and visualize data so users can make decisions.
Current Solutions
Clients today have four primary options, none of which scales and each of which has its own limitations:
System integrators to bring all the varied data sets together, which results in a custom solution with high ongoing maintenance costs
In-house data teams to build proprietary solutions for the enterprise, which are very prescriptive and require a long-term investment commitment from executives
SaaS point solutions focused on specific objectives that work well for their use case but don't integrate with other applications and optimize locally rather than achieving a global maximum
Building a fully functioning Digital Twin for each asset, which takes time, requires a hard ontology definition, and has latency too high for deployment in truly critical environments
I am taking on the challenge of building an Agentic Asset Intelligence Platform that securely collects, organizes, and acts on all physical asset data to increase the life and value of those assets. My hunch is that the sweet spot for marginal cost is under $0.15 per GB of dark data mined, the point at which the ROI exceeds the cost; a rough break-even sketch follows.
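As a back-of-the-envelope sketch of that break-even point, consider the Python snippet below. Only the $0.15-per-GB marginal-cost target comes from the hunch above; the value recovered per GB and the data volume are hypothetical assumptions.

    # Back-of-the-envelope break-even sketch for mining dark data.
    # Only the $0.15/GB marginal-cost target comes from the text above;
    # the value recovered per GB and the data volume are hypothetical assumptions.
    MARGINAL_COST_PER_GB = 0.15          # target ceiling, $ per GB mined
    ASSUMED_VALUE_PER_GB = 0.40          # hypothetical value unlocked, $ per GB
    DARK_DATA_VOLUME_GB = 10_000_000     # hypothetical volume for one enterprise

    cost = MARGINAL_COST_PER_GB * DARK_DATA_VOLUME_GB
    value = ASSUMED_VALUE_PER_GB * DARK_DATA_VOLUME_GB
    roi = (value - cost) / cost

    print(f"cost=${cost:,.0f}  value=${value:,.0f}  ROI={roi:.0%}")
    # The thesis holds whenever the value unlocked per GB exceeds the marginal cost;
    # at $0.40 vs $0.15 this prints an ROI of roughly 167%.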
Why is it the right time?
There is a growing share of connected, smart assets on a $430 trillion global physical-asset balance sheet. By 2050 there will be over 100 billion IoT assets in the physical world, generating over 230 exabytes of data every day. I require four fundamental capabilities that have only recently come to coexist:
Inference-time reasoning on multi-modal GPT models to consume and contextualize unstructured data
A hands-free form factor that is extremely portable and can run low-latency AR applications with the combined power of computer vision
True agentic systems that can do multi-step reasoning for complex asset optimization objective functions
Immutable, decentralized data-ownership contracts that can protect data access for individual users at scale
The cost of these technologies has dropped more than fivefold since launch, with Moore's-law scaling still in effect, making them financially viable for a solution. Coupled with the growing size of the problem, this makes for a perfect window of opportunity.