
Building MLGUI, user interfaces for machine learning applications



Machine learning is eating the world, and spilling over into established software disciplines, too. After MLOps, is the world ready to welcome MLGUI (Machine Learning Graphical User Interface)?

Philip Vollet is something of a data science celebrity. As a senior data engineer at KPMG Germany, Vollet leads a small team of machine learning and data engineers building the integration layer for internal company data, with access standardization for internal and external stakeholders. Outside of KPMG, Vollet has built a toolchain to find, process, and share content on data science, machine learning, natural language processing, and open source using exactly those technologies.

While there are many social media influencers sharing perspectives on data science and machine learning, Vollet actually knows what he is talking about. Where most focus on model building and infrastructure scaling, Vollet also looks at the user-facing side: frameworks for building user interfaces for applications that use machine learning. We were intrigued to discuss with him how building these user interfaces is necessary to unlock AI’s true potential.

The lifecycle of machine learning projects

Vollet and his team build data and machine learning pipelines to analyze internal data and work on reports for KPMG’s management. They implement a layer enabling access to data and build applications to serve this goal. The first question to address when it comes to building user interfaces for machine learning applications is: are those applications different from traditional applications, and if so, how?

Vollet finds that most of the time there is not much difference, because he applies the same steps to develop a machine learning product that he does for “regular” software development projects. Vollet walked us through his method of approaching software development projects. The steps are as follows:

It starts with budgeting, and then people allocation. Based on the project’s budget, the project is staffed. Then the project has to be brought into KPMG’s DevOps environment. From there, sprints are planned, stakeholders are consulted, and the project’s implementation life cycle starts. Seen at this level of abstraction, every software project looks the same.

Continuous integration / continuous delivery is another good DevOps practice that Vollet’s team applies. What is different in projects that involve machine learning is that there are more artifacts to manage. Crucially, there are datasets and models, and evolution in both of those is very real: “It’s possible that today a model fits perfectly into our needs, but in six months we have to re-evaluate it,” Vollet said. MLOps, anyone?

So at which point does a user interface come into play in machine learning projects? The brief answer is: as soon as possible. Generally, Vollet likes to have stakeholders in the loop as early as the first iteration, because they can familiarize themselves with the project and their feedback can be incorporated early on.

A good user interface is needed because code snippets alone are too abstract, Vollet said: “With a Graphical User Interface, people can get an idea of what’s happening. Having an interface changes everything, because it’s easier for people to understand what’s happening. Most of the time, machine learning is really abstract. So we have an input, there’s a workflow, and then we have the end result. If you have a user interface, you can directly show the impact of what you are doing.”

Building user interfaces for machine learning applications

What are the key criteria to consider when choosing a framework to build a user interface for machine learning applications? For Vollet’s team, the ability to run on-premises, in KPMG’s own cloud, is the top priority. For many projects at KPMG, it’s a requirement.

Then comes charting. The range of charts and diagrams that each user interface framework supports is one of the most important parameters. The framework also has to be easy to use and to fit into their technology stack.

For Vollet, this means “something that the operations team can support.” If a framework is already on the list of supported ones, neither the operations team nor the development team needs an extra request, or extra time to familiarize themselves with it.

Vollet’s team uses many tools and keeps testing new ones. The market for frameworks that help build user interfaces for machine learning projects is growing: new players appear and old ones evolve. The big question is what the frameworks of choice for Vollet are, the ones his team usually works with.

Vollet’s default option is Streamlit, “because it’s super easy. You have features like a date picker. Also, you can have a front-end with a file upload, which business analysts can use as a front end to upload their Excel files or CSV, then do some adjustments.”
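To give a flavor of that workflow, here is a minimal Streamlit sketch: a file upload plus a date picker feeding a filtered table and a quick chart. The column names (“date” and “amount”) and the CSV layout are assumptions made purely for this example, not anything from Vollet’s projects.

```python
# Hypothetical Streamlit app: upload a CSV, pick a cutoff date, view the result.
# Assumes the CSV has "date" and "amount" columns (illustrative only).
import pandas as pd
import streamlit as st

st.title("CSV upload demo")

uploaded = st.file_uploader("Upload a CSV file", type=["csv"])
cutoff = st.date_input("Only show rows on or after")

if uploaded is not None:
    df = pd.read_csv(uploaded, parse_dates=["date"])
    filtered = df[df["date"].dt.date >= cutoff]
    st.dataframe(filtered)                               # interactive table
    st.line_chart(filtered.set_index("date")["amount"])  # quick chart of the "amount" column
```

Saved as app.py, this is served in the browser with `streamlit run app.py`.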

For something a bit more advanced, Vollet’s choice is Gradio: “It’s more focused for machine learning. There are so many features built into it in a short time. You can run it on Jupyter notebooks, or on Google Colab. It’s super-integrated and it’s cool, I highly recommend it.”
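A minimal Gradio sketch of that idea: wrap a prediction function in an Interface and launch it, which renders inline in Jupyter or Colab. The keyword-based “classifier” below is a placeholder standing in for a real model.

```python
# Hypothetical Gradio demo: wrap a prediction function in an Interface.
# The keyword-based "classifier" is a placeholder for a real model.
import gradio as gr

def classify(text: str) -> dict:
    positive = sum(word in text.lower() for word in ("good", "great", "love"))
    score = min(1.0, 0.5 + 0.25 * positive)
    return {"positive": score, "negative": 1.0 - score}

demo = gr.Interface(fn=classify, inputs="text", outputs="label")
demo.launch()  # renders inline in a notebook, or serves a local web app
```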

Plotly’s Dash is another option Vollet thinks highly of. Dash’s promise is to enable users to build and deploy analytic web apps using Python, R, and Julia, with no JavaScript or DevOps required. Dash is built by Plotly and leverages Plotly’s charting under the hood. This one is more suitable for enterprises, as it needs infrastructure to run on, but it has good charting support, Vollet said.
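To illustrate the Python-only promise, here is a minimal Dash sketch: a dropdown driving a Plotly chart via a callback. Plotly’s bundled gapminder sample dataset is used purely for illustration.

```python
# Hypothetical Dash app: a dropdown driving a Plotly line chart via a callback.
# Uses Plotly's bundled gapminder sample data for illustration.
import plotly.express as px
from dash import Dash, dcc, html, Input, Output

df = px.data.gapminder()
app = Dash(__name__)

app.layout = html.Div([
    html.H2("Life expectancy by country"),
    dcc.Dropdown(sorted(df["country"].unique()), "Germany", id="country"),
    dcc.Graph(id="chart"),
])

@app.callback(Output("chart", "figure"), Input("country", "value"))
def update_chart(country):
    # Re-render the chart whenever the dropdown value changes.
    return px.line(df[df["country"] == country], x="year", y="lifeExp")

if __name__ == "__main__":
    app.run(debug=True)
```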

Last but not least, there’s what Vollet called the new kid on the block, Panel. It’s a high-level application and dashboarding solution for Python. Panel works with visualizations from Bokeh, Matplotlib, HoloViews, and many other Python plotting libraries, making them instantly viewable either individually or when combined with interactive widgets that control them.
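A minimal Panel sketch along those lines: a slider widget bound to a Matplotlib figure, viewable inline in a notebook or served as an app. The sine wave stands in for real model output.

```python
# Hypothetical Panel dashboard: a slider bound to a Matplotlib figure.
import matplotlib.pyplot as plt
import numpy as np
import panel as pn

pn.extension()

def plot(freq: float):
    fig, ax = plt.subplots(figsize=(6, 3))
    x = np.linspace(0, 2 * np.pi, 200)
    ax.plot(x, np.sin(freq * x))
    ax.set_title(f"sin({freq:.1f}x)")
    plt.close(fig)  # prevent duplicate rendering in notebooks
    return fig

freq = pn.widgets.FloatSlider(name="Frequency", start=0.5, end=5.0, value=1.0)
pn.Column(freq, pn.bind(plot, freq)).servable()  # `panel serve app.py`, or view inline
```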

MLGUI: The art and science of developing GUIs for machine learning applications

Besides those open source frameworks, Vollet had some additional honorable mentions. One of those was Deepnote. Deepnote is not a user interface framework per se. Rather, it is touted as a new kind of data science notebook, Jupyter-compatible, with real-time collaboration, running in the cloud. As notebooks also have visualization capabilities, it may be relevant here too.

Another tool Vollet mentioned was Gooey. It’s the kind of tool used to put a user interface on a Python application or script. It’s not so much a charting library for building a user interface for machine learning applications, although it can be used for that.
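For context, a minimal Gooey sketch: the decorator turns an ordinary command-line script into a desktop form with a file chooser, without adding any charting of its own. The script body and its arguments are placeholders for illustration.

```python
# Hypothetical Gooey script: the decorator turns an argparse-style CLI into a desktop form.
from gooey import Gooey, GooeyParser

@Gooey(program_name="Batch scorer")
def main():
    parser = GooeyParser(description="Score a CSV file with a saved model")
    parser.add_argument("input_csv", widget="FileChooser", help="CSV file to score")
    parser.add_argument("--threshold", type=float, default=0.5, help="Decision threshold")
    args = parser.parse_args()
    # Placeholder for the actual scoring logic.
    print(f"Scoring {args.input_csv} with threshold {args.threshold}")

if __name__ == "__main__":
    main()
```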

Integration seems to be centered around data science notebooks. When using Google Colab, for example, you can use Gradio and Plotly, so they are integrated in some sense, said Vollet. If you want full stack integration, then perhaps you are better off with Dash, he added.

Another interesting question is the degree to which those frameworks offer some flavor of MLOps support. If a new feature gets added to a machine learning model, would those frameworks be able to pick it up and use it, or would this have to be done manually? Gradio can do this, at least to some extent; in other frameworks, this would be a manual process, Vollet said.
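To make that point concrete, here is a hedged sketch of what picking up a new feature can look like in Gradio: because the interface is declared next to the prediction function, an added input (an assumed “age” feature here) is one extra entry in the inputs list rather than separate front-end work. The scoring rule is a placeholder, not anything from Vollet’s projects.

```python
# Hedged sketch: adding a model feature means adding one input component.
# The scoring rule and the "age" feature are assumptions for illustration.
import gradio as gr

def predict(income: float, age: float) -> str:
    return "approve" if income / max(age, 1.0) > 1000 else "review"

demo = gr.Interface(
    fn=predict,
    inputs=[gr.Number(label="Income"), gr.Number(label="Age")],  # the new feature is one extra entry
    outputs="text",
)
demo.launch()
```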

Our takeaway is that MLGUI is another burgeoning domain adjacent to data science and machine learning. Just as MLOps is the application of DevOps principles and practices to the special needs that arise from developing machine learning at scale, we would argue MLGUI is on the rise. It’s the otherwise well-known art and science of developing GUIs for applications, with the twist of applying it to applications utilizing machine learning. Even though that’s not a category in and of itself at this point, perhaps it should be.


Author: George Anadiotis
Source: Venturebeat
