Some time ago I published an article about how to store the NetEye SMS Protocol log in an ELK environment. After using that setup for a while, I discovered that it was not completely correct, as the date/time handling in the Logstash filters is a bit more complicated than I thought. In particular, the date is written in the SMS protocol file like this:
Wed Jun 29 10:30:22 CEST 2016
And we used this Logstash date filter to convert it:
date {
    locale => "en"
    match => [ "sms_timestamp_text", "EEE MMM dd HH:mm:ss" ]
}
At first it seemed to work, but after a few days, when the next month started, we discovered that in the first days of a month the date looks like this:
Fri Jul 1 10:30:22 CEST 2016
Since the timestamp contains a textual time zone, which the date filter does not support, our first draft used this grok rule to extract sms_timestamp_text:
match =>[ "message", "%{SMS_TIMESTAMP_SHORT:sms_timestamp_text}
%{WORD:timezone} %{YEAR}:%{INT:sms_phonenumber}:%{GREEDYDATA:sms_text}"
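SMS_TIMESTAMP_SHORT is a custom grok pattern whose definition is not shown in this article. Assuming a raw timestamp such as "Wed Jun 29 10:30:22", it could be built from the standard grok patterns roughly like this (a sketch, not necessarily the exact definition we used):
# Hypothetical entry in a custom grok patterns file: day of week, month, day of month and time.
# The time zone and the year are matched separately by the surrounding grok rule.
SMS_TIMESTAMP_SHORT %{DAY} %{MONTH} %{MONTHDAY} %{TIME}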
We discovered that our date filter could not handle this, because "dd" expects a two-digit day. So how can we deal with it, since neither "d" nor "dd" alone matches every date? After studying the filter documentation I found the solution: the date filter accepts multiple patterns ("or" rules), so it can match more than one date format. We therefore changed the filter like this:
date {
    locale => "en"
    match => [ "sms_timestamp_text", "EEE MMM dd HH:mm:ss Z yyyy", "EEE MMM d HH:mm:ss Z yyyy" ]
}
Note the new Z and yyyy tokens: if we do not match the complete date, the filter does not work correctly. To feed it the full timestamp, the grok match also had to change, like this:
match => [ "message", "%{SMS_TIMESTAMP:sms_timestamp_text}:%{INT:sms_phonenumber}:%{GREEDYDATA:sms_text}" ]
As I said earlier, the Logstash date filter cannot parse textual time zones, but that is exactly what we have here. What can we do? We know that our dates are in a Western European time zone, so we can handle it with a mutate filter like the one sketched below.
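The following is a minimal sketch of such a mutate filter, assuming the log only ever contains the CET/CEST zone names; the gsub values are my assumption, not taken verbatim from the original configuration:
mutate {
    # Replace the textual zone names (which the date filter cannot parse)
    # with their numeric offsets, so that the "Z" token above can match them.
    # Assumption: only CET/CEST ever appear in the SMS protocol file.
    gsub => [
        "sms_timestamp_text", "CEST", "+0200",
        "sms_timestamp_text", "CET", "+0100"
    ]
}
This mutate has to be placed before the date filter in the configuration. An alternative would be to strip the zone from the string entirely, drop Z from the match patterns, and set the date filter's timezone option (for example timezone => "Europe/Rome").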
Author
Juergen Vigna
I have over 20 years of experience in the IT sector. After early experience in software development for public transport companies, I decided to join the young and growing team of Würth Phoenix. Initially, I was responsible for the internal Linux/Unix infrastructure and the management of CVS software. Afterwards, my main challenge was to establish the now well-known IT System Management Solution WÜRTHPHOENIX NetEye. As a Product Manager I started building NetEye from scratch, analyzing existing open source models, extending them and finally joining them into one single powerful solution. After that, my job turned into a passion: constant development, customer installations and support became a personal matter. Today I use my knowledge as a NetEye Senior Consultant as well as NetEye Solution Architect at Würth Phoenix.