bdougie
5 mins read
Last year, Nat Friedman and Daniel Gross kicked off the AI Grant and have since incubated 60 companies with pre-seed and seed funding.
AI Grant Announcement on Twitter
Applications are open once again, and I was curious to see which open source projects benefited from participating in the previous batches. My thesis is that open source provides an organic opportunity for growth and traction for AI startups.
I put this list together by tracking the public GitHub organizations of AI Grant recipients on OpenSauced. There I could quickly see project activity and which projects have an open source developer community whose engagement matches their traction.
The OpenSauced AI Grant recipient list is public and viewable here.
Rowy is open source, and it's not just for developers. I always tell this story when I start working with new engineering teams. I once built an entire database for my employer's marketing site using Google Docs and a bunch of code. Only later did I discover that there were companies offering this as a service - they're called CMSs! Thankfully, I survived my over-engineering days, and now there's an even better solution: using a spreadsheet-like interface to manage a database without even realizing it.
What's cool about Rowy is that it opens up your database to the rest of your team since they probably already know how to use a spreadsheet. Low-code tools are getting more popular by the day, and Rowy stands out by using GPT-3 to generate projects and workflows.
Not too long ago, Rowy launched BuildShip, which takes things up a notch by helping users create workflows for no-code tools. You might think that no-code tools come with limitations and lock-ins, but BuildShip is here to change that. It's compatible with over a million npm packages and offers one-click deployment options to various cloud platforms like AWS and Azure.
Rowy has been around for a few years, and it looks like the majority of recent contributions still come from its cofounders (marked in OpenSauced as maintainers), but the project has decent traction with over 5k stars and almost 500 forks.
While searching through their GitHub repo, I saw they engage the community by allowing users to upvote items on the roadmap. This seems like a clever way to build a highly engaged user base, and it shows.
Stats from the repo: ⭐ 5.5k · 👀 58 · Forks: 457 · License: Apache-2.0
Replicate allows you to run machine learning models using a cloud API without needing to be an expert in machine learning or infrastructure management. You can run open-source models published by others, or package and publish your own, either publicly or privately. The great part is that many of these models are open-source on GitHub, making it an excellent resource for learning.
Think of cogs as GitHub repositories that you host on Replicate for private or public use in your projects. Managing and hosting ML models can take several paths, most of which require a strong background in MLOps, which can be intimidating for beginners.
With Replicate, you can explore hosted cogs and easily integrate them into your project with just a few lines of code. I highly recommend browsing through some hosted models for inspiration for your AI side project.
Explore some hosted Replicate cogs
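To give a sense of what "a few lines of code" looks like in practice, here is a minimal sketch using Replicate's Python client. The model reference and prompt are illustrative placeholders; copy the exact `owner/name:version` string from whichever model page you pick, and set your API token in the environment.

```python
# pip install replicate
# Assumes the REPLICATE_API_TOKEN environment variable is set.
import replicate

# Illustrative model reference; replace it with the exact
# "owner/name:version" string from a model's page on Replicate.
output = replicate.run(
    "stability-ai/stable-diffusion:VERSION_HASH",
    input={"prompt": "a watercolor painting of a robot tending a garden"},
)
print(output)
```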
A number of the example cogs are made by Zeke, the founding designer at Replicate. Thanks to the cog model, contributing to the Replicate open source project is approachable, and that will be a driving force for more users adopting shared models on the platform.
The secret sauce for a lot of open source is making the project extensible, and cogs are proving that model. Their community is growing fast, onboarding new hosted cogs every day, some with up to 100k runs.
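For a feel of why contributing a model is approachable, here is a minimal sketch of a Cog predictor. The model loading and return value are placeholders, and a real cog also ships a `cog.yaml` that describes its build environment.

```python
# predict.py -- minimal Cog predictor sketch; the model details are illustrative.
from cog import BasePredictor, Input


class Predictor(BasePredictor):
    def setup(self):
        # Load model weights once, when the container starts.
        # self.model = load_weights("weights.bin")  # placeholder
        pass

    def predict(self, prompt: str = Input(description="Text prompt")) -> str:
        # Run inference on the input and return the result.
        # return self.model.generate(prompt)  # placeholder
        return f"echo: {prompt}"
```

With that in place, `cog predict` runs the model locally and `cog push` publishes it to Replicate.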
https://github.com/replicate/cog
Stats from the repo: ⭐ 6.4k · 👀 62 · Forks: 434 · License: Apache-2.0
Every time we've experienced a major shift in computing - from mainframes to PCs, the web, mobile devices, and the cloud - we've seen the birth of a new software stack. In this latest stack, AI takes charge of the application logic layer. It's programmed using natural language, and it's incredibly flexible. But there's more to the story than just logic - we can't forget about memory. To handle the memory, storage, and state layers, we need a completely new approach: AI programmable memory. This goes beyond simply storing and fetching information; it actually determines what info is accessible to the AI. This is crucial for making AI systems reliable, controllable, easy to understand, and safe.
Enter Vector Databases: these specialized databases are designed to efficiently store, manage, and work with high-dimensional vector data. They're not like your traditional databases with tables or documents - instead, Vector Databases use mathematical representations of data points in multi-dimensional space. This makes searching and retrieving information faster and more accurate, especially when dealing with huge data sets or complex queries.
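As a toy illustration of the idea (not any particular database's API), nearest-neighbor search over embeddings boils down to comparing vectors, for example with cosine similarity:

```python
import numpy as np

# Toy example: three "documents" already embedded as 4-dimensional vectors.
# Real embeddings typically have hundreds or thousands of dimensions.
docs = np.array([
    [0.90, 0.10, 0.00, 0.20],
    [0.10, 0.80, 0.30, 0.00],
    [0.85, 0.20, 0.10, 0.10],
])
query = np.array([0.88, 0.15, 0.05, 0.15])

# Cosine similarity between the query and every document vector.
scores = docs @ query / (np.linalg.norm(docs, axis=1) * np.linalg.norm(query))
print(scores.argsort()[::-1])  # document indices, most similar first
```

A Vector Database does the same comparison, but with indexing structures that keep it fast across millions of vectors.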
By taking advantage of advanced indexing and search algorithms, Vector Databases boost the performance and scalability of AI applications. They're an essential ingredient for developing next-gen AI systems, and that's where Chroma comes in. It's a Vector Database that gives LLMs (Large Language Models) the memory and context they need to power intuitive AI experiences for their users.
I recently sat down with one of the Chroma founders to talk about their unique position in the space and how they stand out from other solutions. https://www.youtube.com/watch?v=BqKUOKMkdic&t=34s
Their secret sauce seems to be making vector search and knowledge-base building accessible. Not only do they work in Python, a well-known language for AI/ML, they also have JavaScript and Rust SDKs supported by the community.
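Here is a minimal sketch of what that accessibility looks like with the Python client; the collection name and documents are made up, and Chroma embeds the text with its default embedding model so you never handle raw vectors.

```python
# pip install chromadb
import chromadb

client = chromadb.Client()  # in-memory client; use PersistentClient to keep data on disk

collection = client.create_collection(name="blog_posts")  # name is illustrative
collection.add(
    ids=["1", "2"],
    documents=[
        "Replicate lets you run ML models behind a cloud API.",
        "Rowy gives your database a spreadsheet-like interface.",
    ],
)

# Query in natural language; Chroma embeds the query and returns the closest documents.
results = collection.query(query_texts=["How do I run a model in the cloud?"], n_results=1)
print(results["documents"])
```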
Stats from the repo: ⭐ 10.8k · 👀 73 · Forks: 882 · License: Apache-2.0
For startups, gaining traction can be even more challenging than building the product itself. Securing funding and exposure creates a win-win situation, and these projects have all benefited from being accessible to the open source community. My thesis on open source is validated: these AI startups gained significant traction not only by making their projects open but also by extending them to the community.
If you are considering building or launching an AI project in 2024, I recommend submitting to the AI Grant before the deadline, February 16th. I wasn't able to cover all the open source projects that received a grant, but here are a few more worth checking out:
And if you'd like to see the longer list of previous open source submissions, check out oss.fyi/aigrant.
Chief Sauce Officer for OpenSauced.