
TIBCO LogLogic® Log Management Intelligence Architecture and Overview

Last updated: Aug 26, 2019



TIBCO LogLogic® provides the industry's first enterprise-class, end-to-end log management solution. Using LogLogic® log management solutions, IT organizations can analyze and archive log and machine data for compliance and legal protection, gain decision support for security remediation, and improve the performance and availability of their overall infrastructure.


Introduction

Exponential growth in machine data over the last decade has created a need for tools that automate the storage and search of this data and help derive insights from it. This growth stems partly from the growing number of machines in IT infrastructures and partly from the increased use of devices such as handhelds, endpoint devices, and IoT sensors. Managing and analyzing this data efficiently can increase productivity and visibility for the business. Using LogLogic® LMI, organizations can analyze and archive machine and log data for compliance and legal protection, decision support for security remediation, and increased performance and improved availability of their infrastructure.

In this article we attempt to answer the questions "Why should we use LogLogic®?" and "How does it help solve the problem?". An organization has multiple sources that generate data. In its raw form, machine data is unstructured and not suitable for visualization, business insight, or analysis. This is where LogLogic® Log Management Intelligence comes into play. Here is a depiction of what the LogLogic® architecture looks like in an organization that needs machine data management.

Architecture

(Diagram: LogLogic® LMI architecture)

Store & Search

This is the first layer of log and machine data management features that LogLogic® Log Management Intelligence (LMI) has to offer. LogLogic® LMI lets users send data from various sources over TCP(S), UDP, or HTTP(S). If users prefer to have LMI pull data instead, its collector framework can pull data stored in files directly into LMI for analysis. Both push and pull of data are supported. For a list of sources supported by LMI, and how these data sources are categorized, refer to the following wiki page: https://legacycommunity.tibco.com/wiki/list-supported-log-sources-loglogic-lmi
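As a minimal, illustrative sketch of the push path, the Python below sends one syslog-style message over UDP. The hostname lmi.example.com and port 514 are placeholders; use the address and listener port of your own LMI appliance.

    import socket

    def send_syslog_udp(message, host="lmi.example.com", port=514):
        """Send one raw syslog-style message over UDP (fire-and-forget)."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(message.encode("utf-8"), (host, port))

    # <14> encodes facility=user(1) and severity=info(6): 1*8 + 6 = 14.
    send_syslog_udp("<14>Nov  9 07:00:21 appserver01 demo: user login succeeded")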

Once the data is in LogLogic® LMI, it is stored, indexed, and parsed. To store data, LMI has pre-parsing intelligence that recognizes and categorizes data based on the source type; for example, the data can correspond to sources such as General Syslog, Windows, Windows Active Directory, and so on. If an unrecognized pattern of data is pushed or pulled into LogLogic® LMI, the data is categorized as "General Syslog". Auto-identification of sources makes it easy to add as many sources to LogLogic® LMI as needed, without users having to tell the system which IP a data source comes from or what type of data it sends. Data can be stored for up to 10 years across local and archive storage, depending on the data storage allocations*. This data is also indexed so users can quickly search and find relevant information. All the features mentioned work out of the box; there is no need to add any plugins to the system to enable them.
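LMI's pre-parsing intelligence is internal to the product, but the idea behind source auto-identification can be illustrated with a toy Python sketch: try known source patterns in order and fall back to "General Syslog" for anything unrecognized. The patterns below are invented for illustration and are not LMI's actual rules.

    import re

    # Toy illustration of source auto-identification (not LMI's actual logic).
    SOURCE_PATTERNS = [
        ("Cisco ASA", re.compile(r"%ASA-\d-\d+")),
        ("Windows Active Directory", re.compile(r"EventID=\d+.*Active Directory")),
    ]

    def identify_source(raw_line):
        for source_type, pattern in SOURCE_PATTERNS:
            if pattern.search(raw_line):
                return source_type
        return "General Syslog"  # unrecognized data lands here, as described above

    print(identify_source("%ASA-6-302013: Built outbound TCP connection"))  # Cisco ASA
    print(identify_source("some unknown appliance chatter"))                # General Syslog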

Searching is quick and easy with LogLogic® LMI, as it offers multiple ways to search for data. There are three ways to search for data in LogLogic® LMI (a toy sketch contrasting the first two methods follows the list):

  1. Regular Expression Search: This feature lets users look for raw data in its raw format. If users send machine or log data to LMI, regular expression search lets them search that unstructured data and display it in raw form. Its main purpose is to allow pattern-based searches: users who want to find data matching a regular expression pattern can do so with this feature. To learn more about regex searches, refer to this YouTube video: "How to perform a Regular Expression Search on TIBCO LogLogic Log Management Intelligence"
  2. Index Search: For faster retrieval of data, LogLogic® LMI indexes the raw data that is ingested. Index data is stored separately on the file system and is created from the raw data. Index searching is faster than regular expression searching. It supports keywords, operators such as AND, OR, and NOT, and combinations of both. To learn more about index searching, refer to this YouTube video: "How to perform an Index Search on TIBCO LogLogic Log Management Intelligence"
  3. Advanced Search: Starting with version 6.1.0, LogLogic® LMI supports advanced search methods. This search also works from indexes, but is the fastest of the three search methods. It offers features such as a timegraph, keyword search, full-text search, aggregated search with grouping of columns, modeled data search (using advanced data models), operators, and regular-expression-based search. It is meant to combine the best of index search and regular expression search while optimizing search performance. To learn more about advanced search, refer to our YouTube video: "How to Use Advanced Search Feature in TIBCO LogLogic Log Management Intelligence"
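To make the trade-off between the first two methods concrete, here is a toy Python sketch (not LMI code): a regular expression search scans every raw line, while an index search builds an inverted index once and then answers keyword queries with set operations such as AND.

    import re
    from collections import defaultdict

    logs = [
        "Nov  9 07:00:21 host1 sshd: Failed password for admin",
        "Nov  9 07:00:22 host2 app: payment completed for order 1234",
        "Nov  9 07:00:23 host1 sshd: Accepted password for alice",
    ]

    # 1. Regular expression search: scan every raw line (flexible, but slower).
    regex_hits = [line for line in logs if re.search(r"Failed password for \w+", line)]

    # 2. Index search: build an inverted index once, then answer keyword
    #    queries without rescanning the raw data.
    index = defaultdict(set)
    for i, line in enumerate(logs):
        for token in line.lower().split():
            index[token].add(i)

    # "failed AND password" is the intersection of the two posting lists.
    index_hits = [logs[i] for i in sorted(index["failed"] & index["password"])]

    print(regex_hits)   # both searches find the same event here
    print(index_hits)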

Data Parsing & Modelling

If the data pushed or pulled into LogLogic® LMI is recognizable to the system, LMI categorizes it automatically based on the source type. Once the categorization is done, LMI deep-parses the data for out-of-the-box analysis reports. This helps users quickly and efficiently find relevant information without the noise. For example, if users push all their Windows data to LogLogic® LMI, it deep-parses events such as:

(Screenshot: examples of deep-parsed Windows events)

These built-in deep-parsed reports provide quick insights about the infrastructure. Now suppose LogLogic® LMI cannot recognize the data sent to it: are users going to have to settle for just store-and-search on those sources? The answer is no. LogLogic® LMI offers data models, a feature that allows users to model, or structure, the data the way they see fit. Once the data is structured using parsing rules, all new incoming data that matches those rules is categorized as structured data. This maximizes the insights and analysis that can be performed on users' custom applications and on sources that LogLogic® LMI cannot recognize out of the box. As the LogLogic team keeps adding support for new devices, users can get quicker access to them through community data models. For a list of data models created and published, visit the following page: https://legacycommunity.tibco.com/wiki/tibco-loglogicr-log-management-intelligence-advanced-data-models
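Conceptually, a parsing rule in a data model maps patterns in the raw text to named columns. The following is a hypothetical Python sketch of that idea; the log format, field names, and rule are invented for illustration and do not reflect LMI's internal rule syntax.

    import re
    from typing import Optional

    # Hypothetical parsing rule for a custom application log that LMI would
    # not recognize out of the box; named groups become structured columns.
    RULE = re.compile(
        r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
        r"(?P<level>[A-Z]+) user=(?P<user>\w+) action=(?P<action>\w+)"
    )

    def apply_rule(raw_line: str) -> Optional[dict]:
        match = RULE.search(raw_line)
        return match.groupdict() if match else None

    print(apply_rule("2019-08-26 20:54:00 INFO user=alice action=login"))
    # {'timestamp': '2019-08-26 20:54:00', 'level': 'INFO', 'user': 'alice', 'action': 'login'}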

Visualizations and Dashboards

So far we have seen how to get data into LogLogic® LMI and what it can do in terms of storing and searching. Let's look at some of the visualizations LMI has to offer. For any given dataset, LogLogic® LMI can not only structure, store, index, and retrieve the data, but also visualize it as dashboards. This helps users perform critical analysis and gain quick access to aggregated data sets. Here is an example of what a visualization looks like for a data set:

(Screenshot: sample dashboards)

These are some sample dashboards that can be built in LogLogic® LMI using your infrastructure data. They provide quick access to critical pieces of information, and LogLogic® LMI dashboards can also provide quick insights on metrics that help gauge various aspects of an organization's infrastructure needs. Some of the popular use cases are:

 - Are we oversubscribing our servers / nodes? - Metrics to monitor would be Storage, RAM, and other system resources for a given pool of devices.

 - Are we getting the traffic that's expected? - Metrics to monitor would be MPS (messages per second) per source, average disk space occupied per day, etc.

 - Historical analysis, such as: how many times have we encountered this event in the past 2 years? - Metrics to monitor would be modeled columns with event IDs, or identifiers unique to that event, combined with a count function.

For a more detailed analysis of what each function can do and what LogLogic® LMI insights offer, refer to the documentation at https://docs.tibco.com/products/tibco-loglogic-enterprise-virtual-appliance

Correlation & Aggregation

As events from various sources are pushed or pulled into LogLogic® LMI, it can get difficult to perform searches in a way that lets you see Event 1 from Source A and Event 2 from Source B in a format where you can correlate the series of events. Luckily, that is exactly what LogLogic® LMI's correlation features achieve. They let you search for events matching a pattern that occurs in a series over a particular time frame, and alert you when such a series of events occurs, using the Advanced Alerting feature. For a quick example of how to use the correlation feature, refer to the following YouTube video: "How to correlate log data using TIBCO LogLogic® Log Management Intelligence"
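The pattern-over-a-time-frame idea can also be sketched outside LMI. The toy Python below flags a second event that follows a first event within a fixed window; the event names and sources are invented for illustration.

    from datetime import datetime, timedelta

    events = [
        {"time": datetime(2018, 11, 9, 7, 0, 21), "source": "firewall", "event": "deny"},
        {"time": datetime(2018, 11, 9, 7, 0, 45), "source": "app", "event": "login_failure"},
    ]

    def correlate(events, first, second, window):
        """Yield (e1, e2) pairs where `second` follows `first` within `window`."""
        events = sorted(events, key=lambda e: e["time"])
        for i, e1 in enumerate(events):
            if e1["event"] != first:
                continue
            for e2 in events[i + 1:]:
                if e2["time"] - e1["time"] > window:
                    break
                if e2["event"] == second:
                    yield e1, e2

    for pair in correlate(events, "deny", "login_failure", timedelta(seconds=60)):
        print("correlated:", pair)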

If you often need to search a similar data set grouped by a column, you can create an aggregation and store it. This enables faster retrieval of grouped data sets.
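As a rough analogy (not LMI's implementation), a stored aggregation is like computing group-by counts once and keeping the result around for fast repeated retrieval. The sketch below groups the parsed sample records from later in this article by country and by activity.

    from collections import Counter

    parsed_events = [
        {"country": "UK", "activity": "Data Access"},
        {"country": "RU", "activity": "App_Login"},
        {"country": "DE", "activity": "Main_Page"},
        {"country": "IN", "activity": "Main_Page"},
    ]

    by_country = Counter(e["country"] for e in parsed_events)
    by_activity = Counter(e["activity"] for e in parsed_events)

    print(by_country)   # Counter({'UK': 1, 'RU': 1, 'DE': 1, 'IN': 1})
    print(by_activity)  # Counter({'Main_Page': 2, 'Data Access': 1, 'App_Login': 1})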

Example

So far we have explored some of the main features of LogLogic® LMI. To put things in perspective, let's look at how typical application or server data appears:

<14>11/09/18 07:00:21	118.12.35.160	http://google.es/adipiscing/lorem/vitae/mattis/nibh/ligula/nec.html?adipiscing=nonummy&lorem=integer&vitae=non&mattis=velit&nibh=donec&ligula=diam&nec=neque&sem=vestibulum&duis=eget&aliquam=vulputate&convallis=ut&nunc=ultrices&proin=vel&at=augue&turpis=vestibulum&a=ante&pede=ipsum&posuere=primis&nonummy=in&integer=faucibus&non=orci&velit=luctus&donec=et&diam=ultrices&neque=posuere&vestibulum=cubilia&eget=curae&vulputate=donec&ut=pharetra&ultrices=magna&vel=vestibulum&augue=aliquet&vestibulum=ultrices&ante=erat&ipsum=tortor&primis=sollicitudin&in=mi&faucibus=sit&orci=amet&luctus=lobortis&et=sapien&ultrices=sapien&posuere=non&cubilia=mi&curae=integer&donec=ac&pharetra=neque&magna=duis&vestibulum=bibendum&aliquet=morbi&ultrices=non&erat=quam&tortor=nec&sollicitudin=dui&mi=luctus&sit=rutrum&amet=nulla&lobortis=tellus&sapien=in&sapien=sagittis	success	Data Access	udp	120	UK
<14>11/09/18 07:00:21	115.10.86.242	http://facebook.com/ultrices/posuere/cubilia.js?sapien=lacinia&placerat=nisi&ante=venenatis&nulla=tristique&justo=fusce&aliquam=congue&quis=diam&turpis=id&eget=ornare&elit=imperdiet&sodales=sapien&scelerisque=urna&mauris=pretium&sit=nisl	success	App_Login	ipv4	400	RU
<14>11/09/18 07:00:21	100.22.150.144	http://china.com.cn/gravida/sem/praesent/id/massa.jpg?aliquam=mattis&sit=egestas&amet=metus&diam=aenean&in=fermentum&magna=donec&bibendum=ut&imperdiet=mauris&nullam=eget&orci=massa&pede=tempor&venenatis=convallis&non=nulla&sodales=neque&sed=libero&tincidunt=convallis&eu=eget&felis=eleifend&fusce=luctus&posuere=ultricies&felis=eu&sed=nibh&lacus=quisque&morbi=id&sem=justo&mauris=sit&laoreet=amet&ut=sapien&rhoncus=dignissim&aliquet=vestibulum&pulvinar=vestibulum&sed=ante&nisl=ipsum&nunc=primis&rhoncus=in&dui=faucibus&vel=orci&sem=luctus&sed=et	success	Main_Page	tcp	100	DE
<14>11/09/18 07:00:21	167.44.192.125	https://yahoo.co.jp/primis/in/faucibus/orci.js?eget=nunc&congue=rhoncus&eget=dui&semper=vel&rutrum=sem&nulla=sed&nunc=sagittis&purus=nam&phasellus=congue&in=risus&felis=semper&donec=porta&semper=volutpat&sapien=quam&a=pede&libero=lobortis&nam=ligula&dui=sit&proin=amet&leo=eleifend&odio=pede&porttitor=libero&id=quis&consequat=orci&in=nullam&consequat=molestie&ut=nibh&nulla=in&sed=lectus&accumsan=pellentesque&felis=at&ut=nulla&at=suspendisse&dolor=potenti&quis=cras&odio=in&consequat=purus&varius=eu&integer=magna&ac=vulputate&leo=luctus&pellentesque=cum&ultrices=sociis&mattis=natoque&odio=penatibus&donec=et&vitae=magnis&nisi=dis&nam=parturient&ultrices=montes&libero=nascetur&non=ridiculus&mattis=mus&pulvinar=vivamus&nulla=vestibulum&pede=sagittis&ullamcorper=sapien&augue=cum&a=sociis&suscipit=natoque&nulla=penatibus	success	Main_Page	udp	400	IN

The above data is being sent from access-point devices. Each entry corresponds to a particular access point that was redirected to fetch a URL for a user, from a target server located in a particular region, over a communication protocol. Without deeper insight into this data, the raw form holds little value for the business. In most cases, users cannot perform operations that give a 360-degree view of the infrastructure, because each of those nodes exists in a separate location within it. It takes many man-hours to mine that data, centralize it, and locate meaningful information among the noise. LogLogic® LMI, as shown in the architecture diagram above, centralizes the data in the first step. This is critical for any analysis tool: a centralized store of data from various sources helps users quickly and effectively find information about their infrastructure. By using LogLogic® LMI, users can save man-hours and a great deal of manual work in centralizing, analyzing, alerting on, aggregating, and correlating events.

Once this data is in LogLogic® LMI, you can model this unstructured data into structured data using Data Models. For example, on the above data set I used a Columnar Parser with TAB as the separator and the Enter (newline) character as the escape character. This was a three-step process: finding data from this source, choosing the parser, and providing the characters for each of those fields. When data is structured using a data model, the same data can be laid out as meaningful information for everyone looking at it. The following pictures illustrate how this data looks in searches after modeling:
(Screenshots: modeled data in LMI search results)
These two images show what the data looks like in LogLogic® LMI when it is modeled using Data Models. The bottom image shows a raw search for the data over a time period, with the columns parsed for matching events. The top image is a tabular format of the data, where the actual message as sent from the access-point device is hidden and only the parsed, meaningful columns and their values are shown. These parsed columns let you perform various functions on the data, such as aggregating it or grouping it by a value like IP, location, or activity. For a complete list of functions that LogLogic® LMI supports, refer to the "Search Syntax Reference" section in the TIBCO LogLogic® Log Management Intelligence User's Guide. This document can be found at: https://docs.tibco.com/products/tibco-loglogic-enterprise-virtual-appliance.
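For readers without an LMI instance handy, here is a rough Python equivalent of what the TAB-separated columnar parsing above does to one sample record. The column names are illustrative labels of mine, not the names LMI assigns.

    COLUMNS = ["timestamp", "client_ip", "url", "result", "activity",
               "protocol", "size", "country"]

    def parse_record(raw_line):
        # Strip the syslog priority prefix (e.g. "<14>") before splitting on TABs.
        if raw_line.startswith("<"):
            raw_line = raw_line.split(">", 1)[1]
        fields = raw_line.rstrip("\n").split("\t")
        return dict(zip(COLUMNS, fields))

    sample = ("<14>11/09/18 07:00:21\t118.12.35.160\thttp://google.es/adipiscing/lorem.html"
              "\tsuccess\tData Access\tudp\t120\tUK")
    print(parse_record(sample))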

Once the operations have been performed on the dataset, you can visualize the data for quicker access to metrics. I used some of the charts LMI offers to create meaningful visualizations:

(Screenshots: sample visualizations built from the modeled data)
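The dashboards themselves are built in the LMI user interface; purely for illustration, the same "requests by country" view from the modeled sample data could be drawn in Python with matplotlib as follows.

    from collections import Counter
    import matplotlib.pyplot as plt

    # Country codes from the four sample records shown earlier.
    counts = Counter(["UK", "RU", "DE", "IN"])

    plt.bar(list(counts.keys()), list(counts.values()))
    plt.title("Requests by country")
    plt.xlabel("Country")
    plt.ylabel("Request count")
    plt.show()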

LogLogic® LMI has a much wider variety of visualization widgets to offer. To better understand the use cases for visualizing data and which widgets can be used, click here.

How To Use This Sample Data And Model

Now that we have an understanding of LMI features and how to use LogLogic® to improve infrastructure visibility and proactively act to improve efficiency and availability, let's look at how to use the sample data. To import the Data Model, users will have to deploy an LMI instance and enable Advanced Features. Navigate to Home > Management > Advanced Features > Data Models and create a new Data Model. Click "Switch to raw mode" as shown below:

Input the contents of the attached txt file into the right-side pane. Save the Data Model and it should be ready to use.

The sample data set used in this article is attached to this page. Users who wish to try out these features can ingest this dataset into LMI using the Remote File Collection and Forwarding features of TIBCO LogLogic® Universal Collector, or have LogLogic® LMI pull the file as a whole. To learn more about how to install and use LogLogic® Universal Collector, refer to the following 2 pages:

Note: This article assumes users who wish to deploy LogLogic® LMI are using at least LMI version 6.2.0 and LSP 33. To learn more about how to deploy an LMI instance, click here.

 


Additional Resources

Attachments

sample_dataset.csv_.zip (68.54 KB)
sampledm.txt (2.72 KB)