
What AI Can and Can’t Do For Your Observability Practice

Artificial intelligence (AI) and large language models (LLMs) have dominated the tech scene over the past year. As a byproduct, vendors in nearly every tech sector are adding AI capabilities and scrambling to promote how their products and services use it. 

This trend has also made its way to the observability and monitoring space. However, the AI solutions coming to market often feel like a square peg in a round hole. While AI can significantly impact certain areas of observability, it is not a fit for others. In this article, I’ll share my views on how AI can and cannot support an observability practice – at least right now.

The Long Tail of Errors

The very nature of observability makes ‘prediction’ in the traditional sense infeasible. In life, certain ‘act of God’ events can impact a business and are impossible to predict – weather-related events, geopolitical conflicts, pandemics, and more. These events are so rare and capricious that it’s implausible to train an AI model to predict when one is imminent.

The long tail of potential errors in application development mirrors this. In observability, many errors may happen only once – you may never see them again – while other types of errors occur daily. So, if you’re looking to train a model that will completely understand and predict all the ways things could go wrong in an application development context, you’re likely to be disappointed.

Poor Quality Data

Another shortcoming of AI in observability is its inability to distinguish details that are irrelevant from those that are not. In other words, AI can latch onto small, inconsequential aberrations that end up having a big impact on your results.

For example, I previously worked with a customer who was training an AI model on hours of basketball footage to predict successful versus unsuccessful baskets. There was one big issue: all footage of unsuccessful baskets included a timestamp on the video. So, the model determined that timestamps have an impact on the success of a shot (not the result we were looking for).

Observability practices often work with imperfect data – unneeded log contents, noisy data, and so on. When you introduce AI without cleaning up this data, you create the possibility of false positives – as the saying goes, “garbage in, garbage out.” Ultimately, this can leave organizations more vulnerable to alert fatigue.
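
To make that concrete, here is a minimal sketch of scrubbing obviously irrelevant fields and noise sources from log records before they ever reach a model. The source and field names below are hypothetical examples, not a real schema – the point is simply that the cleanup happens before the AI sees the data.

```python
# A minimal sketch of scrubbing noisy records before they reach an AI model.
# The source and field names are hypothetical examples, not a real schema.

NOISY_SOURCES = {"heartbeat", "healthcheck"}            # routine chatter we never want to baseline
IRRELEVANT_FIELDS = {"debug_trace", "session_cookie"}   # detail a model could wrongly latch onto

def clean_record(record):
    """Return a trimmed copy of the record, or None if it is pure noise."""
    if record.get("source") in NOISY_SOURCES:
        return None
    return {k: v for k, v in record.items() if k not in IRRELEVANT_FIELDS}

def clean_stream(records):
    """Yield only the cleaned, relevant records."""
    for record in records:
        cleaned = clean_record(record)
        if cleaned is not None:
            yield cleaned
```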

Where AI Does Fit Observability

So, where should we be using AI in observability? One area where AI can add a lot of value is baselining datasets and detecting anomalies. In fact, many teams have been using AI for anomaly detection for quite some time. In this use case, AI systems can, for example, learn what “normal” activity looks like across different seasonalities and flag outliers. In this way, AI can give teams a proactive heads-up when something may be going awry.
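
As an illustration only – not any vendor’s implementation – a seasonality-aware baseline can be as simple as bucketing a metric by day-of-week and hour, then flagging points that drift too far from that bucket’s history. The sketch below assumes a pandas Series indexed by timestamp holding a hypothetical request-rate metric.

```python
import pandas as pd

def flag_outliers(series: pd.Series, threshold: float = 3.0) -> pd.DataFrame:
    """Baseline each (day-of-week, hour) bucket and flag points beyond `threshold` standard deviations."""
    buckets = [series.index.dayofweek, series.index.hour]
    baseline_mean = series.groupby(buckets).transform("mean")
    baseline_std = series.groupby(buckets).transform("std")
    zscore = (series - baseline_mean) / baseline_std
    return pd.DataFrame({
        "value": series,
        "zscore": zscore,
        "anomaly": zscore.abs() > threshold,  # the proactive heads-up that something may be awry
    })
```

Production systems layer much more on top (trend handling, multiple seasonalities, learned thresholds), but the principle – learn “normal” per season, flag the rest – is the same.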

Another area where AI can be helpful is in shortening the learning curve when adopting a new query language. Several vendors are currently working on natural language query translators driven by AI. A natural language translator is an excellent way to lower the barrier to entry when adopting a new tool. It frees practitioners to focus on the flow and the practice itself rather than the pipes, semicolons, and all the other nuances that come with learning a new syntax.
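
As a sketch of the idea – not any vendor’s product – such a translator is essentially a well-constrained prompt wrapped around an LLM. In the example below, `complete` is a stand-in for whatever LLM completion API you use, and the output query syntax is purely illustrative.

```python
# Minimal sketch of an AI-assisted natural-language-to-query translator.
# `complete` is a stand-in for any LLM completion call; the target query
# syntax shown in the comment is illustrative, not a specific product's language.

PROMPT_TEMPLATE = """You translate plain-English questions about logs into search queries.
Reply with the query only, nothing else.

Question: {question}
Query:"""

def translate_to_query(question, complete):
    """Ask the model for a candidate query; a human should still review it before running it."""
    prompt = PROMPT_TEMPLATE.format(question=question)
    return complete(prompt).strip()

# Illustrative usage:
#   translate_to_query("error rate for the checkout service over the last hour", complete)
# might return something like:
#   service="checkout" level="error" earliest=-1h | stats count() by bin(5m)
```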

What to Focus on Instead

Whether beginning a journey with AI or making any other improvement, understanding usage trends is essential to optimizing the value of an observability practice. Improving a system without understanding its usage is akin to throwing darts in a pitch-black room. If no one uses the observability system, it’s pointless to have it. Many different analytics can help you know who’s using the system and, conversely, who isn’t using it but should be.

Practitioners should focus on usage related to the following (a rough sketch of pulling these numbers follows the list):

  • User-generated content – are users creating alerts or dashboards? How often are those dashboards being viewed? How delayed is the data reaching them, and can this be improved?
  • Queries – how often are you running the queries that power dashboards and alerts? Are they fast or slow, and could they be optimized for performance? Understanding and improving query speed can improve development velocity for core functions.
  • Data – what volume is stored, and from what sources? How much of the stored data is actually queried? Where are the hotspots and dead zones, and can storage be tiered to optimize cloud storage costs?
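
For illustration, here is a rough sketch of summarizing those three areas from an exported audit/usage log. Every event type and field name is a hypothetical placeholder for whatever your observability platform actually emits.

```python
from collections import Counter

def usage_report(events):
    """Summarize dashboard views, query performance, and data hotspots from usage events."""
    dashboard_views = Counter(e["dashboard_id"] for e in events if e["type"] == "dashboard_view")
    query_runs = [e for e in events if e["type"] == "query_run"]
    ingested_bytes = sum(e["bytes"] for e in events if e["type"] == "ingest")
    scanned_bytes = sum(e["bytes_scanned"] for e in query_runs)
    return {
        "top_dashboards": dashboard_views.most_common(5),
        "avg_query_ms": sum(e["duration_ms"] for e in query_runs) / max(len(query_runs), 1),
        "slow_queries": [e["query_id"] for e in query_runs if e["duration_ms"] > 5_000],
        # How much of what you store is ever read: hotspots vs. dead zones.
        "scanned_to_ingested_ratio": scanned_bytes / ingested_bytes if ingested_bytes else 0.0,
    }
```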

Closing Thoughts

I believe that AI is currently at the peak of the hype curve. In an application development setting, pretending AI does what it doesn’t do – i.e., predict root causes and recommend specific remediations – is not going to propel us past the hype to the point where the technology becomes genuinely useful. There are very real ways that AI can turn the gears on observability improvements today – and that is where we should focus.

