
Why companies are following the MaaS trend

Applications based on artificial intelligence may be more or less useful to end users, but they have certainly become far easier to develop thanks to what is known as MaaS, or Model-as-a-Service: a delivery model that provides pre-trained foundation AI models as a service to power applications. Most developers want to avoid training, hosting and managing the artificial intelligence models behind their apps, because the implementation difficulties alone can be enough to sink an application idea. MaaS makes it easy to use machine learning models that are already trained for an application’s requirements, without the costs associated with model development, and it allows start-ups to concentrate on their core applications rather than building models themselves.

Lower costs, clear benefits for users

Training and deploying artificial intelligence models is a very different ball game from simply choosing among those already available, such as Microsoft’s Phi-3, Google’s Gemini, OpenAI’s GPT models, Meta’s Llama 3 and others. MaaS also differs from IaaS (Infrastructure-as-a-Service), where the service is the infrastructure, not the code. At most, it can be compared to other cloud offerings, the difference being that Model-as-a-Service provides products with complex machine learning capabilities rather than just hosting and little more.

In the beginning was Microsoft

Microsoft introduced its MaaS program in 2023, initially only with Meta’s Llama 2 and Mistral 7B, later expanding the catalogue with Core42’s JAIS, Nixtla’s TimeGEN-1, Stability AI and Cohere. However, it is not just Microsoft: Alibaba Cloud, the technology backbone of Alibaba Group Holding, offers MaaS services through its ModelScope platform for various open-source models. Launched a year ago, ModelScope provides 300 ready-to-use artificial intelligence models for developers and researchers, covering a range of capabilities from NLP (natural language processing) to computer vision. Tencent, the world’s largest video game publisher, launched its own MaaS platform last year.

Speaking at the World Conference on Artificial Intelligence in Shanghai, Dowson Tong, Tencent’s Senior Executive Vice President and head of its cloud business, said: “By leveraging these capacity models, partners can easily create their own unique models by adding their own unique data. Through privatized deployment, authority control and data encryption, we prioritise data protection for business users, ensuring secure and reliable use of the model.” Tencent Cloud MaaS offers pre-trained artificial intelligence models for various business sectors, including media, finance, healthcare and education.

Less implementation, goodbye DevOps?

A few days ago, Microsoft announced MaaS within its Azure AI Studio, aimed at simplifying the deployment of artificial intelligence models for developers. The service offers a streamlined way around the complexities of implementing AI models: with access to a curated catalogue, developers can easily activate and use models, significantly lowering the technical barriers. Although it is theoretically possible to replace some of the most basic coding with artificial intelligence, higher levels of human expertise still hold sway and probably will for an indefinite period. Indeed, AI can help engineers save time, just as computers and specialized software help accountants, managers and other professionals. However, DevOps consists of two parts, Dev and Ops. What about the Ops part?

In the classic definition, Ops specialists generally work closer to business applications and IT infrastructure. Their work involves hardware repairs and upgrades, and coordination with other departments, quality-control specialists and other teams to increase efficiency and establish best practices. It is important to bear in mind that IT operations in the context of DevOps are not the same as traditional IT operations: DevOps has no traditional Ops role per se, so developers and Site Reliability Engineers (SREs) may share operational responsibilities. With the emergence of this trend, DevOps specialists have become highly skilled at performing operational tasks with tools and procedures, such as Continuous Integration/Continuous Delivery (CI/CD), that traditional operations did not use.


With the current level of diagnostic hardware and software used by modern industrial machinery and technological infrastructure, artificial intelligence could analyze the data and suggest (perhaps even perform) some minor adjustments. But as we move up the skill ladder, we run into the same situation as on the Dev side: at present, artificial intelligence struggles with complex problems, with questions that require unconventional solutions and, most importantly, with defining business goals and priorities.

Expanding the AI model library

To date, the MaaS offering boasts over 1,600 AI models, covering a wide range of functionalities. Recent additions to this library include Nixtla’s TimeGEN-1 and Core42’s JAIS, with further expansions planned, including AI21, Bria AI, Gretel Labs, NTT Data, Stability AI and Cohere. The MaaS framework is designed to be highly inclusive, allowing developers to use artificial intelligence models for inference and fine-tuning on a pay-as-you-go basis. This eliminates the need for direct interaction with the underlying hardware or extensive configuration, making the implementation of artificial intelligence more accessible. Seth Juarez, Microsoft’s principal program manager for the AI platform, points out that the service hides the intricate implementation details so that developers can focus on the creative aspects of their projects. Although the MaaS model is designed to be highly flexible, Microsoft recognizes that some specialized or unique models may not fit this framework because of their specific requirements.
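To make the “no hardware, no configuration” point concrete, the minimal sketch below shows what consuming a hosted model can look like from the developer’s side. The endpoint URL, header, model name and response shape are placeholders invented for illustration; every MaaS provider publishes its own API, and only the general pattern matters here: a metered API call instead of self-hosted models and infrastructure.

```python
import os
import requests

# Hypothetical MaaS endpoint and key: each provider (Azure AI Studio,
# ModelScope, Tencent Cloud, ...) defines its own URL scheme and auth header.
ENDPOINT = "https://example-maas-provider.com/v1/chat/completions"
API_KEY = os.environ["MAAS_API_KEY"]

def ask_hosted_model(prompt: str) -> str:
    """Send a prompt to a pre-trained, provider-hosted model and return its reply."""
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "llama-3-70b-instruct",  # picked from the provider's catalogue
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 256,
        },
        timeout=30,
    )
    response.raise_for_status()
    # Many chat-style APIs nest the answer roughly like this; the exact
    # response shape depends on the provider.
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_hosted_model("Summarise what Model-as-a-Service means in one sentence."))
```

Nothing in this code trains, hosts or scales a model; the provider meters the calls and bills for usage, which is the essence of the pay-as-you-go framework described above.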

Finding the LLM for every need

When developers have a particular job that artificial intelligence can solve, it is usually not as simple as pointing an LLM at the data. There are other considerations, such as cost, speed and accuracy, and finding a way to balance all of these is particularly challenging, especially with so many new models available. This is where Unify comes in: the British start-up has built a tool that lets developers input their parameters and find the best LLM for their specific needs.

Unify’s main objective is to work out which models and suppliers are best for a company, using objective benchmarks and dashboards. The term most often used here is ‘router’: a kind of neural network that learns which models are best at performing certain tasks. The router draws on background knowledge, with exhaustive benchmarks on every new model that reaches the market, using GPT Pro as a ‘judge’. The idea is to have an impartial scale that advises on the available models without preference. That impartiality is something today’s fast-changing market risks losing, leaving room for a small oligarchy of AI suppliers.
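Unify’s actual router is not documented here, so the following is only a toy sketch of the routing idea: each candidate model carries benchmark-derived scores for quality, cost and latency, and a weighted score picks the best trade-off for a given workload. All model names, numbers and weights are invented for the example; a real router would learn them and refresh its benchmarks as new models appear.

```python
from dataclasses import dataclass

@dataclass
class ModelProfile:
    name: str
    quality: float   # benchmark score, 0-1 (e.g. judged win rate)
    cost: float      # dollars per million tokens
    latency: float   # seconds to first token

# Illustrative figures only, not real benchmark results.
CATALOGUE = [
    ModelProfile("large-flagship-model", quality=0.92, cost=30.0, latency=1.2),
    ModelProfile("mid-size-model",       quality=0.85, cost=3.0,  latency=0.6),
    ModelProfile("small-fast-model",     quality=0.72, cost=0.5,  latency=0.2),
]

def route(w_quality: float, w_cost: float, w_latency: float) -> ModelProfile:
    """Pick the model with the best weighted trade-off between quality, cost and speed."""
    max_cost = max(m.cost for m in CATALOGUE)
    max_latency = max(m.latency for m in CATALOGUE)

    def score(m: ModelProfile) -> float:
        # Reward quality, penalise cost and latency normalised against the worst option.
        return (w_quality * m.quality
                - w_cost * (m.cost / max_cost)
                - w_latency * (m.latency / max_latency))

    return max(CATALOGUE, key=score)

# A latency-sensitive chatbot weights speed heavily...
print(route(w_quality=1.0, w_cost=0.3, w_latency=1.5).name)   # small-fast-model
# ...while an offline analysis job cares almost only about quality.
print(route(w_quality=1.0, w_cost=0.02, w_latency=0.05).name)  # large-flagship-model
```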

Antonino Caffo has worked in journalism, particularly technology journalism, for fifteen years. He is interested in topics related to IT security as well as consumer electronics. Antonino writes for the most important Italian generalist and trade publications. You can sometimes see him on television explaining how technology works, which is not as trivial as it might seem.