
You don’t need a GPT model to solve every enterprise problem: Mphasis’ Srikumar Ramanathan

With Indian information technology (IT) majors like Tata Consultancy Services (TCS), Infosys, Wipro and Tech Mahindra hopping on the artificial intelligence (AI) bandwagon and launching generative AI services, midcap IT services firm Mphasis did not want to be left behind. In an interview with TechCircle, Srikumar Ramanathan, Chief Solutions Officer at Mphasis, stressed that while AI is the flavor of the season, concepts like generative AI will continue to be a game changer for the technology industry. Edited excerpts:

How are you leveraging AI and what is your focus area?

If you look at our history, unlike most other global system integrators (GSIs), who were in the infrastructure space, we’ve used the Front2Back (F2B) approach to deliver value to businesses, while shrinking application cores and modernizing the ecosystem. We were ahead of the industry in leveraging a cloud and cognitive layer (a combination of AI, analytics and other forms of intelligence) for the customer – primarily banks – so that they can empower the front end without impacting the stability of the backend. So, while AI has been the flavor of the day, it’s not something we started working on overnight, and as I said, we have always dabbled in large language models (LLMs). Nonetheless, the emergence of ChatGPT and other generative AI models has shown that every industry needs to respond to the changing dynamics, and one needs to do that efficiently and differently. In fact, by the time ChatGPT was announced, we were in a good position to leverage our existing AI models. For example, since 2019 we had been running several design thinking workshops, and we also wrote programs that could scan the diagrams from those workshops and convert them to HTML. So we’ve been doing things like that for a number of years, and that gave us a good foundation to build on digital transformation and introduce AI in the appropriate fashion.

In June this year, Mphasis launched its gen AI unit. How is the new unit helping enterprises differentiate themselves in a fiercely competitive market?

The new division, known as Mphasis.ai, will offer guidance on integrating generative AI solutions; create proprietary generative AI technologies; license more than 250 AI models available on hyperscaler marketplaces, along with frameworks developed at Mphasis Next Labs, our in-house research and innovation lab; and collaborate with 50 startups to assist clients in solution development. Mphasis.ai will also supply clients with conversational AI tools, such as chatbots, to employ in their businesses.

What kind of companies have you partnered with for this new initiative?

We have partnered with several hyperscalers such as Amazon Web Services (AWS), Google Cloud Platform (GCP) and Microsoft Azure. In fact, Mphasis probably has the largest number of AI models available on AWS Marketplace. We have also collaborated with specialized, market-leading AI platform and solutions companies such as Kore.ai and Databricks, amongst others. Our partnership with Kore.ai will enhance the integration, implementation and engineering of top-notch AI solutions, and in turn will also improve its contact centre offerings.

Which areas do you need to think ahead about when it comes to advancing gen AI?

Needless to say, generative AI is providing us with a lot of opportunities. But like every other technology advancement, it has its own pitfalls. One area of concern is security, for which we have partnered with Securonix to combine its analytics-driven detection and automated response tools with Mphasis’ digital and cyber defense expertise, and provide future-ready cyber threat monitoring and response services to enterprises and government agencies globally. Also, from the business perspective, our focus has been on how we get our 34,000 employees ready to harness AI quickly and efficiently. To train people in gen AI, we’re helping them form groups and giving them internal projects that have real-life implications. Our objective is to get 5,000-odd people trained before the end of this financial year in using products like Copilot, CodeWhisperer and more. Banking, insurance, healthcare, high-tech, travel and transportation, and logistics are the primary verticals that we are tapping. We are also implementing a generative AI-based internal knowledge management system, where we respond to customer queries in real time. So we have a three-pronged approach that we’re taking to harness the power of AI.
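[Ramanathan does not detail the architecture of that knowledge management system. A common pattern for this kind of real-time question answering is to retrieve the most relevant internal documents and pass them, along with the question, to a generative model. The sketch below shows only the retrieval step, using the open-source sentence-transformers library; the model name, sample documents and helper function are illustrative assumptions, not Mphasis’ implementation.]

```python
# A minimal sketch of the retrieval step behind a knowledge-management
# assistant; model name and documents are illustrative placeholders.
from sentence_transformers import SentenceTransformer, util

# Placeholder snippets standing in for an internal knowledge base.
docs = [
    "To reset a customer portal password, raise a ticket with the IAM team.",
    "Claims over $10,000 require second-level approval in the workflow tool.",
    "The nightly batch settlement job runs at 02:00 UTC.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_embeddings = encoder.encode(docs, convert_to_tensor=True)

def retrieve(query: str, top_k: int = 2) -> list[str]:
    """Return the knowledge-base snippets most relevant to the query."""
    query_embedding = encoder.encode(query, convert_to_tensor=True)
    hits = util.semantic_search(query_embedding, doc_embeddings, top_k=top_k)[0]
    return [docs[hit["corpus_id"]] for hit in hits]

# The retrieved snippets would then be combined with the user's question
# and sent to a generative model to draft the answer in real time.
print(retrieve("How do I reset a password?"))
```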

Does the high operational cost associated with scaling gen AI models pose a challenge for you? 

While high operational costs associated with generative AI may hinder adoption, in our case they do not. We have used many open-source models, besides the LLMs we were already leveraging. And it is not necessary to have a GPT kind of model to solve every problem. Language models like BERT, for example, can readily be fine-tuned and used for the appropriate tasks. We look at our customer’s problem, analyse it and tell them which AI model they should be using, rather than just assuming that they need a large language model to solve their problems. That said, AI adoption has become democratized. Earlier, AI used to be an area that only specialist data scientists could work in, and therefore most CIOs found it a bit limiting. But with API interfaces to all these models, it is much easier to integrate AI into many of the things you already do, and that indeed is making a difference.
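[For illustration, here is a minimal sketch of what fine-tuning a smaller model like BERT for a narrow classification task can look like, using the open-source Hugging Face transformers and datasets libraries; the public dataset, label count and hyperparameters are placeholders, not Mphasis’ setup.]

```python
# A minimal sketch, assuming the Hugging Face `transformers` and `datasets`
# libraries; dataset and hyperparameters are illustrative, not production values.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # e.g. a binary intent/sentiment classifier

# A small public dataset standing in for enterprise data.
dataset = load_dataset("imdb")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-finetuned",
                           num_train_epochs=1,
                           per_device_train_batch_size=16),
    # Subsample to keep the sketch cheap to run.
    train_dataset=dataset["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=dataset["test"].select(range(500)),
)
trainer.train()
```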
