
How to build AI into operations. flespi experience

17 July, 2024

AI is rapidly transforming the landscape of business and technology. As more companies adopt AI, their operational scopes expand, and their approaches to daily tasks change. 

At Gurtam, we are embracing AI across our organization and product teams. In our upcoming series of articles, we will explore our journey with AI, detailing our experiences and offering insights on its implementation.


AI, LLM, and RAG, in simple words

AI technology for text processing resembles human cognitive abilities, applying similar logic to the tasks we perform daily. For instance, handling a corporate email involves reading, analyzing, sometimes grabbing a cup of coffee, consulting additional sources, and formulating a response that reflects one’s knowledge, corporate culture, and personal style.

In AI-powered text processing, we use advanced algorithms called large language models (LLMs) like ChatGPT, which learn from vast data sets to generate textual output quickly. Enhancing these capabilities, Retrieval-Augmented Generation (RAG) integrates a retrieval component that allows LLMs to access external information in real-time, enriching responses with context-specific details. It's like an AI doing research while generating responses.
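To make the RAG idea more tangible, here is a minimal sketch of the retrieve-then-generate pattern in Python. It is an illustration only, not flespi's implementation: the document list, the naive keyword retrieval, and the use of the OpenAI SDK are assumptions made for the example.

```python
# Minimal retrieve-then-generate sketch (illustrative only).
# Assumes the OpenAI Python SDK (>=1.0) and a tiny in-memory document store.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

documents = [
    "Device X sends position data every 30 seconds over TCP.",
    "Parameter 'can.mileage' reports the odometer value from the CAN bus.",
    "To enable the ignition sensor, set configuration flag DIN1 to 1.",
]

def retrieve(question: str, top_k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval standing in for a real vector search."""
    q_words = set(question.lower().split())
    scored = sorted(documents, key=lambda d: -len(q_words & set(d.lower().split())))
    return scored[:top_k]

def answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("How often does device X report its position?"))
```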

Scheme: AI-powered text processing using the RAG framework


All the tasks united by the pattern

To understand how AI integrates into flespi operations, we spoke with Jan Bartnitsky, a developer at flespi. Jan explained how flespi, under the guidance of Aliaksei Shchurko, CEO of Gurtam and head of the flespi team, is pioneering AI applications within the company.


Jan Bartnitsky, developer at flespi, Gurtam

  • Jan, could you please share your background with AI? How do you work with this technology at flespi?

Once we understand the algorithm AI follows, we can apply it to any internal team’s task, provided it meets these conditions:

  1. There’s a certain similarity of inputs;

  2. The task is a repetitive one and can be performed many times within a short period of time;

  3. The task consumes lots of human resources. 

As workforce costs rise significantly within companies, the benefits become clear — businesses can reduce expenses by minimizing the time spent on routine tasks.

At flespi, we have deployed AI to automate a series of repetitive text-processing tasks. Specifically, the AI has read-only access to client data, which prevents any actions that could disrupt service or data security, such as deleting accounts or sending unsafe commands like turning off an engine in a moving vehicle.
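As an illustration of the read-only principle, a guard layer in front of the assistant's tools could look like the sketch below. The tool names are hypothetical and do not correspond to flespi's actual API; the point is simply that mutating operations are never exposed to the AI.

```python
# Hypothetical guard exposing only read-only operations to the AI assistant.
# Tool names are illustrative, not flespi's actual API surface.
READ_ONLY_TOOLS = {
    "get_device_messages": lambda device_id: f"messages for device {device_id}",
    "get_device_settings": lambda device_id: f"settings for device {device_id}",
}

def call_tool(name: str, **kwargs):
    if name not in READ_ONLY_TOOLS:
        # Mutating actions (deleting accounts, sending engine-off commands)
        # are rejected before they ever reach the platform.
        raise PermissionError(f"AI assistant may not call '{name}'")
    return READ_ONLY_TOOLS[name](**kwargs)

print(call_tool("get_device_messages", device_id=123))   # allowed
# call_tool("delete_device", device_id=123)              # raises PermissionError
```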

Another important point is the division of responsibility. flespi's AI is designed not to resolve issues autonomously but to suggest solutions based on user input, just as when these processes were handled manually.

Understanding AI's logic allows us to identify routine tasks suited for automation. If a task involves repetitive text processing and requires human involvement, it is an ideal candidate for delegation to AI. No one enjoys performing the same dull activity four times a day, and at Gurtam we treat staff happiness as a goal in itself, as our CEO has mentioned before, so we implement AI wherever possible without compromising data security or service quality.

  • How is AI currently implemented in flespi’s communication tools?

Well, there are several directions where AI is being used at full force. First, applying AI to the technical support of our client base is the most obvious one. This approach aligns perfectly with the input-process-output framework we employ.

Second, technical content management. All Gurtam software products integrate with numerous telematics hardware devices, which make up the core of a telematics solution. Traditionally, adding a new hardware device description to our website was time-consuming and involved extensive research and formatting. Now, AI assists in these tasks by automating data collection from official documentation, formatting, and tagging (e.g., "e-vehicle," "IP67," "waterproof") before publication on flespi.com, with our team overseeing the final steps.
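A hedged sketch of what such an extraction step could look like is shown below: a fragment of official documentation is passed to an LLM with a fixed tag vocabulary, and the model returns structured JSON. The prompt, tag list, and model name are assumptions for illustration; the actual flespi tooling is not public.

```python
# Illustrative extraction of device tags from documentation text.
# Assumes the OpenAI Python SDK; the tag vocabulary is an example only.
import json
from openai import OpenAI

client = OpenAI()

ALLOWED_TAGS = ["e-vehicle", "IP67", "waterproof", "CAN", "BLE", "2G", "4G"]

def extract_tags(doc_text: str) -> list[str]:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system",
             "content": f'Return JSON like {{"tags": [...]}} using only tags from {ALLOWED_TAGS}.'},
            {"role": "user", "content": doc_text},
        ],
    )
    return json.loads(response.choices[0].message.content)["tags"]

spec = "The tracker has an IP67-rated waterproof housing and a built-in CAN adapter."
print(extract_tags(spec))  # e.g. ["IP67", "waterproof", "CAN"]
```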

Another hardware-related application is in processing incoming device reports. Previously, interpreting these reports required extensive documentation review. AI has transformed this process by enabling intelligent search capabilities that go beyond keywords to analyze issue descriptions directly. Enhanced by RAG, our knowledge base pulls relevant data from the official device documentation to ensure high-quality outputs. This system relies heavily on the sophistication of the LLM, with around 90% of its effectiveness attributable to the LLM's quality and complexity.
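To show what searching "beyond keywords" can mean in practice, here is a minimal embedding-based search sketch. The embedding model, the in-memory index, and the snippets are assumptions; a production system would use a proper vector store rather than brute-force cosine similarity.

```python
# Minimal semantic search over documentation snippets using embeddings.
# Assumes the OpenAI Python SDK; a real deployment would use a vector database.
from openai import OpenAI

client = OpenAI()

snippets = [
    "Error code 105 means the device lost its GNSS fix for more than 60 seconds.",
    "Parameter 'pwr.ext.volt' reports the external power supply voltage in volts.",
    "A firmware update requires the device to stay online for at least 5 minutes.",
]

def embed(texts: list[str]) -> list[list[float]]:
    result = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in result.data]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm

def search(issue_description: str) -> str:
    doc_vectors = embed(snippets)
    query_vector = embed([issue_description])[0]
    best = max(range(len(snippets)), key=lambda i: cosine(query_vector, doc_vectors[i]))
    return snippets[best]

# The issue is described in the user's own words, not in documentation keywords.
print(search("the tracker keeps saying it cannot find satellites"))
```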

These examples show how we optimize routine work, but it is thanks to LLMs that it became possible to expand the range of tasks the tool can take on. We used to think that only humans could read complex data, synthesize it, and analyze it to build logical connections. With LLMs, we see that this is no longer true: we now have a powerful tool that can perform these tasks, too.

  • What’s your experience with AI so far? How has it benefited the team’s workload? How do the users react?

I'd like to share insights from a colleague who has been with Gurtam since the beginning. He recalls the launch of Wialon as a web application in 2007, a revolutionary move that made GPS tracking available online rather than installed on individual machines; back then, it created a significant "wow" effect.

Today, we're witnessing a similar effect with the implementation of our AI assistant in flespi, named Codi. Our user base consists primarily of developers who value fast response times: Codi often delivers relevant code within a minute. Moreover, users often view their interaction with the AI tool as a learning opportunity, recognizing the complexity of developing such solutions and the quality of what we've achieved.

Internally, flespi’s developer-driven team employs AI to reduce mundane tasks and focus on innovative work. This shift toward AI integration is reshaping our approach to routine tasks, enabling us to explore creative solutions that eliminate them. At Gurtam, the key focus is using technology to enhance creativity and the team’s happiness, so here we’ve got the perfect match.

Chart: flespi responses

  • What has been the biggest challenge in adopting AI so far?

At flespi, we prioritize innovative and adaptable data storage solutions. Even though it is 2024, we often come across external technical documentation still provided in PDF format, with print-oriented layouts and complex tables. This format aligns with how our clients use it but falls short of their expectations for a more innovative information architecture. Preparing such documents for AI use involves correcting formatting errors, marking up headings to structure the text, and cleaning the content, which is a considerable effort when handling numerous device models.

In this regard, Teltonika provides a nice example of web-based technical documentation. We can easily parse it in the required way, navigate via links, and collect the information needed for a quality output. Another efficient aspect of how they arrange their documentation is that it covers all the device models, including their specific features. Because of this, and because Teltonika holds a stable top position in device connection growth rate, we first applied our AI assistance to device model specifications for Teltonika telematics products.
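For illustration, harvesting structured content from web-based documentation can be as simple as the sketch below: fetch the page, walk the headings, and keep the text under each one. The URL is a placeholder, and requests plus BeautifulSoup are common choices rather than necessarily what the flespi team uses.

```python
# Sketch of collecting sections from web-based documentation (illustrative).
# The URL is a placeholder; library choices are assumptions, not flespi's stack.
import requests
from bs4 import BeautifulSoup

def collect_sections(url: str) -> dict[str, str]:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    sections = {}
    for heading in soup.find_all(["h2", "h3"]):
        # Gather the text that follows each heading until the next heading.
        parts = []
        for sibling in heading.find_next_siblings():
            if sibling.name in ("h2", "h3"):
                break
            parts.append(sibling.get_text(" ", strip=True))
        sections[heading.get_text(strip=True)] = " ".join(parts)
    return sections

docs = collect_sections("https://example.com/device-model-docs")  # placeholder URL
for title, body in docs.items():
    print(title, "->", body[:80])
```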

Another point worth mentioning is the cost associated with complex models that deliver significant benefits. Balancing quality and expense requires deciding when to deploy more advanced, costly models and when basic ones are enough.

The graph below illustrates the daily cost variation, indicating that increased funds are necessary during periods of intensive development, which involve numerous paid tests:

Chart: daily cost variation

Unpredictability in AI interactions also presents difficulties. Unlike human communication, which generally follows predictable patterns, AI may interpret inputs in unexpected ways. For example, our AI assistant, Codi, once mistakenly claimed to have checked logs — a function beyond its capabilities. This error was rooted in the initial instruction that Codi was a team member, leading it to assume it could perform all team functions. This issue was resolved by adjusting the AI's directives to reflect its capabilities more accurately.
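For illustration, the fix can be thought of as narrowing the system prompt from a broad role to an explicit list of capabilities. The wording below is a made-up example, not Codi's actual instructions.

```python
# Illustrative system-prompt adjustment; not Codi's real directives.
# The first wording lets the model assume it can do anything a team member can;
# the second spells out what it can and cannot do.
PROMPT_BEFORE = "You are Codi, a member of the flespi support team."

PROMPT_AFTER = (
    "You are Codi, an AI assistant for flespi users. "
    "You can read public documentation and the user's read-only account data. "
    "You cannot check internal logs, change settings, or send device commands; "
    "if asked to do so, say so and suggest contacting the team."
)
```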

  • Can you share any unexpected results or side effects that were not planned? 

It's quite ironic that we initially thought we'd be assigning tasks to robots, but now it's AI like Codi that's generating tasks for our developers. Codi now assists in creating tasks for hardware manufacturers who develop device management software.

At the start of the year, our team at the Gurtam AI Hackathon developed a solution that uses AI to generate PVM code (the internally developed programming language used within the flespi team). We input a code snippet along with descriptions and example solutions, and the AI outputs its own code. While sometimes flawed — either not compiling or performing incorrectly — the output is often well-structured, sparking creativity among our developers to refine our programming language. These 'mistakes' reveal innovative ways to approach coding.
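The loop behind such an experiment can be pictured roughly as below: the model is prompted with a description and example solutions, and each candidate is checked before being handed over. PVM is flespi's internal language, so the compile check here is a stub, and the whole sketch is an assumption about the workflow rather than the hackathon code itself.

```python
# Rough generate-and-check loop (illustrative). 'try_compile' is a stub because
# the real PVM compiler is internal to flespi.
from openai import OpenAI

client = OpenAI()

def try_compile(source: str) -> bool:
    """Placeholder for the real PVM compiler check."""
    return len(source.strip()) > 0  # toy criterion for the sketch

def generate_code(task_description: str, examples: str, attempts: int = 3) -> str | None:
    for _ in range(attempts):
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system", "content": "Write PVM code. Examples:\n" + examples},
                {"role": "user", "content": task_description},
            ],
        )
        candidate = response.choices[0].message.content
        if try_compile(candidate):
            return candidate
    return None  # hand the last attempt to a developer for review
```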

  • flespi recently announced that the AI assistant is hardware-agnostic and can work with numerous telematics devices. Could you elaborate on this?

Our hardware-agnostic approach is a cornerstone of all Gurtam products, ensuring seamless integration of client hardware with our software solutions. This principle extends to our adoption of AI: the AI tool provides consulting on devices, applications, and accessories from various telematics hardware manufacturers with consistent quality and depth. It also helps telematics manufacturers' customer service teams manage client communications across platforms of diverse types and sizes.

All official technical documentation from telematics hardware manufacturers is uploaded into the Retrieval-Augmented Generation (RAG) system and processed efficiently to meet client support needs. For instance, if a client uses devices from multiple manufacturers, our AI can handle discrepancies in terminology, such as 'odometer' and 'CAN mileage' referring to the same value, to ensure effective communication. While 100% alignment is impossible because of device-specific details, our goal is to get as close to it as we can.
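A minimal sketch of handling such terminology discrepancies is a normalization step that maps manufacturer-specific terms onto one vocabulary before retrieval. The synonym table below is purely an example; real mappings would come from each device's documentation.

```python
# Example of normalizing manufacturer-specific terminology to one vocabulary.
# The synonym table is illustrative; real mappings depend on each device's docs.
SYNONYMS = {
    "can mileage": "odometer",
    "total distance": "odometer",
    "ext voltage": "external power voltage",
}

def normalize(question: str) -> str:
    text = question.lower()
    for vendor_term, common_term in SYNONYMS.items():
        text = text.replace(vendor_term, common_term)
    return text

print(normalize("Why does CAN mileage differ from the dashboard?"))
# -> "why does odometer differ from the dashboard?"
```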

  • Looking forward, how do you see AI evolving within flespi?

At the moment, we see it as a process that both brings joy from the results achieved and reduces the amount of routine work. About 80% of all the resources spent on AI integration at flespi go into technical support, so the more optimized it is, the better for the team. Moreover, the volume of support requests is hard to predict; sometimes you find yourself dealing with these tasks immediately because they are urgent. So we plan to keep evolving here while sustaining the quality level. We consider an AI-first approach to our technical support as the next step. Currently, AI handles around 40% of tickets first, and the tendency is to increase this share so that AI takes every ticket first, with team members joining later.
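As a thought experiment, an AI-first flow could be routed as in the sketch below: every ticket goes to the assistant first, and a team member joins when the assistant is not confident enough. The confidence threshold, ticket fields, and stubbed assistant call are assumptions, not flespi's actual support pipeline.

```python
# Hypothetical AI-first routing for support tickets (illustrative only).
from dataclasses import dataclass

@dataclass
class Ticket:
    text: str

def ai_answer(ticket: Ticket) -> tuple[str, float]:
    # Stand-in for the real assistant call; returns a draft answer and a confidence score.
    return "Suggested fix: check the device ident in the channel settings.", 0.62

def route(ticket: Ticket, threshold: float = 0.75) -> str:
    answer, confidence = ai_answer(ticket)
    if confidence >= threshold:
        return f"AI reply sent: {answer}"
    return f"Escalated to a team member with the AI draft attached: {answer}"

print(route(Ticket("My device connects but no messages arrive.")))
```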

  • Which advice would you give to others planning to implement AI?

Here are a few tips that helped me early on. First, don't stress about costs when experimenting: set aside a budget you're comfortable with and use it to try AI. This approach allows you to experiment freely without the fear of financial loss. To give you an idea of the costs involved, consider this: during Gurtam's AI hackathon, each team received $100 for two days of development experiments. It's worth noting that not every team fully used the allocated amount.

Second, AI should not be viewed just as a tool for programmers. Even if you lack coding skills, AI can generate code for you. Focus on understanding the logic and algorithms, and consider building your own Retrieval-Augmented Generation (RAG) setup to address straightforward problems from the start.

Third, identify tasks that AI can handle, particularly repetitive ones, and explore integrating these with AI technology. This shift can significantly enhance your team's efficiency and transform your daily operations.