Navigating Tomorrow: Unveiling the Latest AI Trends

AI Startup Trends
Unsurprisingly, roughly 30% of the startups (80 out of 269) carry a self-reported AI tag. Take the exact number with a grain of salt, but the trend is clear: startups leveraging AI are now a sizable part of the cohort.
I analyzed 20–25 startups from this batch to understand some of the larger trends, particularly among startups that are leveraging LLMs (large language models). The trends span how they identify problems to solve, what approaches they take to solutions, what they are doing right, and the potential risks in their approach.
But before we go into trends, let’s start with a general framework for how tech companies (small or large) can think about generating value from AI.
AI Value Chain
If you have been following tech news recently, there has been an explosion of content about AI, and it is not always easy to make sense of where each piece fits into the broader picture. Let's use a simplified framework to think about it.
AI is a very broad term encompassing a wide range of technologies: regression models that predict outcomes, computer vision systems that identify objects, and, most recently, LLMs (large language models). For the sake of this discussion, we will focus on LLMs, which have been in the spotlight since OpenAI opened ChatGPT to the public and kicked off an AI race among companies.
Tech companies leveraging AI typically operate in one of three layers:
- Infrastructure: This includes hardware providers (e.g., NVIDIA, which makes the GPUs that support the heavy computation required for AI models), compute providers (e.g., Amazon AWS, Microsoft Azure, and Google Cloud, which provide processing power in the cloud), AI models/algorithms (e.g., OpenAI and Anthropic, which provide LLMs), and AI platforms (e.g., TensorFlow, which provides a platform for training your models)
- Data platform / tooling layer: This includes platforms that enable collecting, storing, and processing data for AI applications (e.g., Snowflake, which provides a cloud data warehouse, and Databricks, which provides a unified analytics platform)
- Application layer: This spans all companies (startups, mid-to-large tech companies, as well as not-natively-tech companies) that are leveraging AI for specific applications (a minimal sketch of how a single feature spans these layers follows below)
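To make the framework concrete, here is a minimal, hypothetical sketch of a single application-layer feature annotated by the layer each dependency sits in. The `OpenAI` client, the model name, and the feature itself are assumptions for illustration, not something taken from the cohort analysis.

```python
# Hypothetical sketch: one application-layer feature, annotated by value-chain layer.
# Assumes the OpenAI Python client (1.x) and an OPENAI_API_KEY in the environment.

from openai import OpenAI  # algorithms layer: a hosted LLM provider

client = OpenAI()  # served from the compute layer (cloud GPUs from the hardware layer underneath)

def summarize_account_activity(rows: list[dict]) -> str:
    """Application layer: the feature the end user actually sees."""
    # Data platform layer: in practice `rows` would come from a warehouse query
    # (e.g., Snowflake or Databricks); a plain list keeps the sketch self-contained.
    activity = "\n".join(f"{r['created_at']}: {r['event']}" for r in rows)

    # Algorithms layer: the LLM turns raw records into a readable summary.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any hosted chat model would do
        messages=[{"role": "user",
                   "content": f"Summarize this account activity in three bullets:\n{activity}"}],
    )
    return response.choices[0].message.content
```

The point of the annotation is that the differentiation lives almost entirely in the application-layer function; everything below it is rented from increasingly commoditized providers.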
Based on where the market currently is and how this has played out in similar situations in the past (e.g., the cloud computing market), the Infrastructure and Data Platform layers will likely converge to a handful of players with relatively commoditized offerings. For example:
- Among hardware players, NVIDIA is currently the leader with its GPU offerings (its stock nearly tripled in 2024), and we will have to see who else catches up
- The compute market has already converged, with AWS, Azure, and Google Cloud owning two-thirds of the market
- In the AI algorithms layer, OpenAI came out strong with the GPT models, but it is a highly competitive market with deep-pocketed players (Google with DeepMind/Google Brain, Meta with LLaMA, Anthropic, Stability AI); see this analysis if you want a deeper take. Two things to note here: (i) most of these companies have access to the same data sets, and if one company does get access to a new paid data set (e.g., Reddit), it is likely that competitors will get access to it as well; (ii) the GPT model sits at the algorithms layer, but the ChatGPT product sits at the application layer (not at the algorithms layer).
Given this likely path to commoditization, the companies operating in these layers have two possible paths they can pursue:
- The first path is beefing up their offerings to operate across layers, as evidenced by recent M&A activity: Snowflake (a data warehousing company in the data platform layer) recently acquired Neeva to strengthen its search capability and potentially unlock applications of LLMs for enterprises, and Databricks (an analytics platform in the data platform layer) acquired MosaicML (in the AI algorithms layer) to make “generative AI accessible for every organization, enabling them to build, own and secure generative AI models with their own data”.
- The second path is moving up to the application layer; ChatGPT is a classic example. OpenAI's strength was in the AI algorithms layer, but with the launch of a consumer product, it is now the first real competitor to Google Search in decades.
A majority of future value unlocked from AI and LLMs will be at the application layer
AI Startup Trends
1. Focus on specific problems and customers
2. Integrations with existing software
3. Leveraging LLMs in conjunction with other AI technologies
4. Customization of LLMs
5. Creative user interfaces
6. High information volume, high precision use cases
7. Data-siloed, BYOD products for enterprise customers: they want to operate in a construct where they can bring their own data (BYOD) to a baseline product and customize the product in a siloed environment (a minimal sketch of this pattern follows below)
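To illustrate trend 7, here is a minimal sketch of the BYOD pattern under stated assumptions: the customer's documents are embedded and searched entirely inside their own environment, and only the few retrieved snippets would ever be placed into an LLM prompt. The `sentence-transformers` model and the function names are illustrative choices, not what any particular startup uses.

```python
# Hypothetical BYOD sketch: the customer's documents stay inside their own silo;
# retrieval runs locally, and only the top-k snippets would be passed to a generation step.
# Assumes the sentence-transformers package; any locally hosted embedding model would do.

import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # runs inside the customer's environment

def build_index(docs: list[str]) -> np.ndarray:
    """Embed the customer's own documents once, inside their silo."""
    return embedder.encode(docs, normalize_embeddings=True)

def retrieve(query: str, docs: list[str], index: np.ndarray, k: int = 3) -> list[str]:
    """Return the k most similar snippets by cosine similarity (vectors are pre-normalized)."""
    q = embedder.encode([query], normalize_embeddings=True)[0]
    scores = index @ q
    top = np.argsort(scores)[::-1][:k]
    return [docs[i] for i in top]

# Only the retrieved snippets would be sent onward to an LLM; the full corpus never leaves.
docs = [
    "Refunds are processed within 5 business days.",
    "Enterprise plans include SSO and audit logs.",
]
index = build_index(docs)
print(retrieve("How long do refunds take?", docs, index, k=1))
```

The design choice that matters for enterprise customers is the boundary: embeddings and retrieval stay in their environment, so the baseline product can be customized with their data without that data leaving the silo.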
Moat Risks
It is definitely exciting to see a large number of AI startups emerge that help both individual consumers and organizations be more effective. These products will undoubtedly be a huge unlock for productivity and for how effectively problems get solved.
However, a key risk with several of these startups is the potential lack of a long-term moat. It is difficult to read too much into it given the stage of these startups and the limited public information available, but it is not difficult to poke holes in their long-term defensibility. For example:
- If a startup is built on the premise of taking base LLMs like GPT, building integrations into helpdesk software to understand the knowledge base and writing style, and then generating draft responses, what is stopping a helpdesk software giant (think Zendesk or Salesforce) from copying this feature and making it available as part of their product suite? (A sketch of how thin this wrapper can be follows this list.)
- If a startup is building a cool interface for a text editor that helps with content generation, what is stopping Google Docs (which is already experimenting with auto-drafting) or Microsoft Word (which is already experimenting with Copilot tools) from copying that? One step further, what is stopping them from providing a 25% worse product and giving it away for free with an existing product suite (e.g., Microsoft Teams taking over Slack's market share)?
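For a sense of how thin that wrapper can be, here is a hedged sketch of the draft-reply step from the first example: a base model plus a prompt carrying knowledge-base snippets and a writing-style sample. The helper name, the OpenAI client, and the model are assumptions for illustration; the argument is that the core feature is essentially a prompt, which an incumbent with the same integrations could replicate quickly.

```python
# Hypothetical sketch of the "draft a helpdesk reply" core: a base LLM plus a prompt
# that carries knowledge-base snippets and a writing-style sample.
# Assumes the OpenAI Python client (1.x); the model name is an illustrative choice.

from openai import OpenAI

client = OpenAI()

def draft_reply(ticket: str, kb_snippets: list[str], style_sample: str) -> str:
    """Generate a draft support response grounded in the customer's knowledge base."""
    prompt = (
        "You are a support agent. Match the tone of this sample reply:\n"
        f"{style_sample}\n\n"
        "Relevant knowledge base articles:\n"
        + "\n".join(f"- {s}" for s in kb_snippets)
        + f"\n\nDraft a reply to this ticket:\n{ticket}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```

The real work for such a startup is the unglamorous part around this function, the helpdesk integrations, data permissions, and workflow, which is also the part an incumbent already owns.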
The companies that don’t have a moat could still be successful in their current form, and the nature of what they do makes them attractive acquisition targets, both from a feature add-on and from a talent perspective. However, building a moat would be critical for startups that are interested in turning these early ideas into huge successes.
One clear approach is building a full product that solves for a problem space and uses AI heavily as part of its feature set (versus an AI-only product that is an add-on on top of an existing problem space).
Another approach is beefing up the product offering (going from an AI feature to a broader product for the problem space) by leveraging some of the above trends in tandem with each other: data integrations, BYOD models, enabling customizations, and combining LLMs with other AI technologies.
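As a hedged illustration of combining LLMs with other AI technologies, the sketch below triages incoming requests with a cheap classical classifier and reserves the LLM for the cases flagged as complex. The scikit-learn pipeline, the labels, and the tiny training set are all made up for illustration.

```python
# Hypothetical sketch: a classical classifier triages requests, and only the complex
# ones are routed to an LLM drafting step (like the draft_reply sketch above).
# Assumes scikit-learn; labels and training examples are made up for illustration.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: 0 = routine (template answer), 1 = complex (route to LLM).
tickets = [
    "How do I reset my password?",
    "Where can I download my invoice?",
    "Your API returns 500 errors when we batch more than 10k records",
    "We need a custom data retention policy for our EU entities",
]
labels = [0, 0, 1, 1]

triage = make_pipeline(TfidfVectorizer(), LogisticRegression())
triage.fit(tickets, labels)

def route(ticket: str) -> str:
    """Send routine tickets to canned answers; reserve the LLM for complex ones."""
    if triage.predict([ticket])[0] == 1:
        return "LLM"       # hand off to an LLM drafting step
    return "template"      # resolve with existing canned responses

print(route("How do I change my password?"))
```

Individually, each piece is easy to copy; combined with data integrations, BYOD deployment, and customization, the whole becomes harder for an incumbent to replicate quickly, which is the moat argument above.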