Open source stacks enabled software to eat the world. Now, several innovative companies are working to build a similar open source software stack for AI development.
Dan Jeffries was there when the LAMP stack kicked this off.
LAMP is an acronym representing the key technologies first used in open source software development — Linux, Apache, MySQL, and PHP. These technologies were once hotly debated, but today they are so successful that the LAMP stack has become ubiquitous, invisible, and boring.
AI, on the other hand, is hotter than ever. Much as the LAMP stack turned software development into a commodity and made it a bit boring, especially if you’re not a professional developer, a successful AI software stack should turn AI into a commodity — and even make it a little boring. That is precisely what Jeffries is setting out to do with the AI Infrastructure Alliance (AIIA).
Innovation, and where it’s at
Jeffries wears many hats. His main role, in theory at least, is chief technical evangelist at data science platform Pachyderm. Jeffries also describes himself as an author, futurist, engineer, systems architect, public speaker, and pro blogger. It’s the confluence of all those things that led him to start the AIIA.
The AI Infrastructure Alliance’s mission is to bring together the tools data scientists and data engineers need to build a robust, scalable, end-to-end, enterprise artificial intelligence and machine learning (AI/ML) platform.
This sounds like such an obvious goal — one that would be so beneficial to so many — you’d think somebody must have done it already. But asking why we’re not there yet is the first step toward actually getting there.
Vendor lock-in is a reason, but not the only one. Vendor lock-in, after all, is becoming increasingly less relevant in a cloud-first, open source-first world — although technology business moats live on in different ways.
Jeffries was surprised that he did not see an organization actually trying to capture the energy around AI activity, bring different companies together, and get their integration teams talking to each other:
“Every founder and every engineer I ended up talking to was very excited. I really didn’t have to work very hard to get people interested in the concept. They understood it intuitively, and they realized that the innovation is coming from these small to mid-sized companies,” Jeffries said.
“They are getting funded now, and they’re up against giant vertically integrated players like SageMaker from Amazon. But I don’t think any of the innovation is coming from that space.”
Having spent more than 11 years of his 20-year career at Red Hat, Jeffries recalls how proprietary software companies used to come up with all the ideas, and open source would then copy them “in a kind of OK way.”
But over time, most of the innovation started flowing to open source and to the smaller companies’ projects, he said.
An open source AI stack for the future
The Amazons of the world have their place — the cloud is where most AI workloads run. Big vertically integrated proprietary systems serve their own purpose, and they’re always going to make money. But the difference is Kubernetes and Docker don’t become Kubernetes and Docker if they only run on Google, Jeffries said.
Innovation is going to come from a bunch of these companies working together like little Lego pieces that we stack together, he added.
That’s precisely what the AIIA is working on.
So, when can we expect to have a LAMP stack for AI? Not very soon, probably, which brings us to the other key reason this has not happened yet.
Jeffries expects a LAMP stack, or a MEAN stack, for AI and ML to emerge in the next five to 10 years and to change over time.
The LAMP stack itself is kind of passé now. In fact, the cool dev kids these days are all about the MEAN stack, which includes MongoDB, Express.js, AngularJS, and Node.js.
He has described these as canonical stacks, which arise with greater and greater frequency “as organizations look to solve the same super challenging problems.”
The kind of momentum that happened with LAMP will occur in the ML space, Jeffries suggested. But he warned against believing the hype that anyone has an end-to-end ML system at this point. This can’t be true because the sector is just moving too fast. The space itself and the problems to solve are shifting as the software is being created.
That makes sense, but then the question is — what exactly is the AIIA doing at this point? And what does the fact that its ranks include some of the most innovative startups in this space, alongside the likes of Canonical and New Relic, actually mean?
Enthusiasm is one thing, but there’s a gap between saying “Hey, that sounds like a good idea, sign me up” and actually coming up with a plan to make it happen. So how are the AIIA and Jeffries going to pull it off?
As a writer, Jeffries used George R.R. Martin’s metaphor of gardeners and architects to explain how he sees the evolution of AIIA over time. Architects plan and execute; gardeners plant seeds and nurture them.
Jeffries identifies as a gardener and sees a lot of the people in the organization as gardeners. He thinks it’s necessary at this phase and envisions the AIIA evolving over time.
Right now, the idea is to get people talking at a lot of different levels, rather than working in their own little silos. Micro-alliances are fair game though: “If you look at 30 logos on the website, you’re not going to build an ML stack with all 30 of those things,” Jeffries said.
Another concern is the fact that building bridges, and communities, takes time and energy. But Jeffries is enthusiastic about the prospect of helping shape what he sees as the AI revolution, is inspired by the open source ethos, and has the leeway from Pachyderm to run with his ideas.
Work, boring work, and AI
That seems to be what he’s doing with the AIIA. He is currently working on turning the alliance into a foundation and is in talks with the Linux Foundation. The goal is to get to the point of bringing in some revenue, which means sorting out finances and a governance structure.
“You get people who are just firmly focused on this, and it becomes a balance of volunteer efforts and people paid to work on different aspects. The next step really is a lot of logistical work — the boring stuff,” Jeffries said.
Another metaphor Jeffries used is that of a strategic board game, where you have to think about everything that can go wrong in advance — a bit like a reverse architect. Inevitably, there is going to be at least some amount of boring work, and somebody needs to do it. But for Jeffries, it’s all worth it.
“When I look at AI at this point, I think very few people understand just how important it’s going to be. And I think they have an inkling of it, but it’s usually a fear-based kind of thing,” he said. “They don’t understand fully that in the future, there are two kinds of jobs: one done by AI, and one assisted by AI.”
Isn’t it actually three types of jobs, as someone has to build the AI? The people building AI are going to be assisted by AI, so that falls into the second category, Jeffries said. There’s a creative aspect, as someone has to come up with an algorithm. But things like hyper-parameter tuning are already being automated, he added.
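For readers curious what that automation looks like in practice, here is a minimal sketch of automated hyperparameter tuning using scikit-learn’s randomized search. The model, toy dataset, and search space are illustrative assumptions of this article, not anything Jeffries or the AIIA prescribe.

```python
# A minimal sketch of the kind of "boring" hyperparameter search that is
# increasingly handed off to machines. Library, model, dataset, and search
# space are illustrative choices only.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_iris(return_X_y=True)

# The search space a human would otherwise explore by hand.
param_distributions = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 3, 5, 10],
    "min_samples_split": [2, 5, 10],
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions=param_distributions,
    n_iter=10,   # try 10 random configurations
    cv=3,        # score each configuration with 3-fold cross-validation
    random_state=0,
)
search.fit(X, y)
print("Best parameters:", search.best_params_)
print("Best CV score:", search.best_score_)
```

The point of the sketch is simply that the tedious trial-and-error loop runs unattended, freeing the person to work on the parts of the problem that still require judgment.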
Jeffries waxed poetic about how “the boring stuff” will be done by AI so people can move up the stack and do more interesting things. Even the creative parts will be a co-creative process between people and AI, in Jeffries’ view.
As for the “AI destroys all the jobs” narrative, we’ve heard this one before, and the previous industrial revolutions worked out fine, Jeffries argued. He is similarly unconvinced by the argument that the pace of innovation is so rapid that there won’t be time to create jobs to replace those that are displaced.
What even an AI optimist like Jeffries can’t easily dismiss is that while innovation may not be coming from the FAANGs, that is where the data is. This creates a reinforcement loop: more data begets more AI, which in turn produces more data, and so on.
Jeffries acknowledges data as a legitimate moat. But he believes ML is progressing in ways that make the dependency on data less vital, such as few-shot learning and transfer learning. This, and the fact that the amount of data the world is creating is not getting any smaller, may spur change.
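As a rough illustration of the transfer learning trend he mentions, the sketch below reuses a backbone pretrained on a large dataset and trains only a small new head on a modest, domain-specific task. The library choice (PyTorch/torchvision), the ResNet-18 backbone, and the hypothetical five-class problem are assumptions made here for illustration, not part of the AIIA’s work.

```python
# A minimal sketch of transfer learning: most of the "data moat" is baked
# into pretrained weights, so only a small new head needs training.
# Backbone and the 5-class task are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

# Start from a model pretrained on ImageNet.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained layers; only the new head will be trained.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final layer with one sized for a small, hypothetical
# 5-class dataset.
backbone.fc = nn.Linear(backbone.fc.in_features, 5)

# Only the new head's parameters are handed to the optimizer, so training
# needs far fewer labeled examples than starting from scratch.
optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
```

Techniques like this, together with few-shot learning, are why Jeffries argues the data advantage of the largest players may matter less over time.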
What seems inevitable, however, is the need to do lots of work, often boring, to be able to chase dreams of creativity.
Author: George Anadiotis
Source: VentureBeat