How to Use Advanced Log Queries¶
This guide teaches you how to leverage Peakhour's powerful log querying system to extract actionable insights from your traffic, security, and performance data using Kibana Query Language (KQL) syntax.
Before you begin: Familiarize yourself with the event log format and security investigation techniques.
Understanding Peakhour's Query Language¶
Peakhour uses Kibana Query Language (KQL) to enable sophisticated filtering of log data across security events, access logs, performance metrics, and application data.
Query Structure¶
A query consists of one or more filters. A filter is composed of a field, an operator, and a value. You can combine multiple filters with logical operators.
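For example, the following query combines two filters (each a field, the : operator, and a value) with AND; both fields are used in later examples in this guide:
# Two filters joined with a logical operator
status:404 AND method:POST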
Supported Data Types¶
- String Fields: Text data with exact or pattern matching
- Numeric Fields: Integers and decimals with mathematical operations
- Timestamp Fields: Date/time data with time-based filtering
- Boolean Fields: True/false values with logical operations
- Array Fields: Lists of values with membership testing
Master Basic Query Syntax¶
Field-Based Filtering¶
Exact Match Queries:
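These examples are a brief sketch using fields and values that also appear later in this guide (client, host, method):
# Requests from a single client IP
client:203.0.113.42
# Requests for a specific host
host:example.com
# All POST requests
method:POST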
Pattern Matching:
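Wildcards (*) allow partial matches; this sketch reuses the path and user_agent fields shown in later examples:
# Any path under /api/
path:/api/*
# Any user agent containing Chrome
user_agent:*Chrome*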
Logical Operators¶
AND Operations:
# Multiple conditions must be true
client:203.0.113.42 AND method:POST
# Combining more than two conditions
client:203.0.113.42 AND method:POST AND status:200
OR Operations:
# Any condition can be true
status:404 OR status:500
# Grouped conditions
(status:404 OR status:500) AND method:POST
NOT Operations:
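For example (a short sketch reusing fields from the surrounding examples):
# Exclude successful requests
NOT status:200
# POST requests that were not blocked by rate limiting
method:POST AND NOT block.by:rate_limit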
Comparison Operators¶
Numeric Comparisons:
# Greater than
response_time > 1000
# Less than or equal
status <= 299
# Range queries
status >= 400 AND status < 500
# Response size filtering (files larger than 1 MB)
bytes > 1048576
String Operations:
# Starts with
path:/api/v1/*
# Contains substring
user_agent:*Chrome*
# Ends with
path:*.php
# Case-insensitive matching
host:*EXAMPLE.COM*
Filtering by Time¶
Time-based filtering is handled through the user interface, not directly within the query string. Use the time range picker to select a predefined or custom time window for your analysis. This ensures that all queries are performed against the correct data set without needing to manually specify time conditions in your query.
Advanced Filtering Techniques¶
Multi-Field Correlations¶
Geographic and Behavioral Analysis:
# High-risk country requests to admin areas
geoip_country_code:(CN OR RU OR BR) AND path:/admin/*
# Mobile traffic from specific regions
user_agent.mobile:true AND geoip_country_code:US AND method:GET
# Bot traffic from hosting providers
user_agent.bot:true AND geoip_as_organization:*Amazon*
Security Event Correlations:
# WAF blocks from the same IP (narrow the timeframe with the time range picker)
block.by:waf AND client:203.0.113.42
# Multiple attack vectors from single source
(waf.matched_rule.tags:sql OR waf.matched_rule.tags:xss) AND client:203.0.113.42
# Rate-limited requests that also matched critical WAF rules
block.by:rate_limit AND waf.matched_rule.severity:CRITICAL
Nested Field Queries¶
Complex Object Filtering:
# WAF rule details
waf.matched_rule.severity:CRITICAL AND waf.matched_rule.tags:sql
# Request headers analysis
request_headers.user-agent:*bot* AND request_headers.accept:""
# Response characteristics
response_headers.content-type:*application/json* AND status:200
Array Field Operations:
# Multiple WAF tags
waf.matched_rule.tags:(sql AND injection)
# Header presence checking
_exists_:request_headers.authorization AND method:POST
# Missing expected fields
NOT _exists_:request_headers.user-agent
Aggregations and Visualizations¶
Peakhour's log explorer provides powerful tools for aggregating and visualizing data without needing complex query syntax. Once you have filtered your logs using KQL, you can use the following UI features for deeper analysis:
- Chart Builder: Create custom charts and graphs to visualize data distributions, trends, and correlations. You can group by different fields, apply aggregate functions like count, average, sum, and more.
- Timeseries Graph: The main timeseries graph automatically shows the distribution of events over your selected time range. You can configure it to display different metrics.
- Column Statistics: The log table itself can often provide quick statistics or be sorted to find top talkers or outliers.
This UI-driven approach separates filtering (KQL) from aggregation, making it easier to explore your data interactively.
Security-Focused Query Patterns¶
Attack Detection Queries¶
SQL Injection Analysis:
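A minimal sketch, assuming SQL injection rules are tagged sql as in the WAF correlation examples above:
# Requests that matched SQL injection WAF rules
waf.matched_rule.tags:sql
# Critical SQL injection detections from a single client
waf.matched_rule.tags:sql AND waf.matched_rule.severity:CRITICAL AND client:203.0.113.42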
XSS Attack Analysis:
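Similarly, assuming XSS rules carry an xss tag as shown in the correlation examples above:
# Requests that matched XSS WAF rules
waf.matched_rule.tags:xss
# Critical XSS detections against POST endpoints
waf.matched_rule.tags:xss AND waf.matched_rule.severity:CRITICAL AND method:POST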
Bot and Crawler Analysis¶
Bot Behavior Patterns:
# Find all bot traffic
user_agent.bot:true
# Find traffic from verified bots
user_agent.bot:true AND bot.verified:true
# Find suspicious bot-like behavior from non-bots
user_agent.bot:false AND (user_agent:*curl* OR user_agent:*python-requests*)
Performance Analysis Queries¶
Response Time Analysis¶
Slow Endpoint Identification:
# Find slow requests
response_time > 2000
# Find slow successful requests on a specific path
status:200 AND path:/api/v1/data AND response_time > 1000
Cache Performance Analysis¶
Cache Hit/Miss Analysis:
# Find all cache hits
cache_status:hit
# Find cache misses for images
cache_status:miss AND response_headers.content-type:*image*
Query Optimization and Performance¶
Efficient Query Patterns¶
Use Indexed Fields First:
It's generally more efficient to filter on indexed fields like client (IP address), status, and host before applying more complex wildcard or text searches.
# Efficient
status:404 AND client:203.0.113.42 AND path:/admin/*
# Less efficient - wildcard search first
path:*admin* AND status:404 AND client:203.0.113.42
Avoid Leading Wildcards:
Queries with leading wildcards (e.g., path:*some/path) are much less performant than trailing wildcards (e.g., path:/some/path*).
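For example, contrasting the two forms from the sentence above:
# Less efficient - leading wildcard
path:*some/path
# More efficient - trailing wildcard
path:/some/path*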
Export and Integration¶
Data Export Options¶
You can export the results of your filtered queries directly from the UI. The export will respect the KQL filter, time range, and selected columns.
API Integration¶
Automated Query Execution:
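The Peakhour API endpoints and parameters are not documented in this guide, so the following Python sketch is only illustrative: the base URL, endpoint path, and query parameters are assumptions, not the actual API. Consult the Peakhour API documentation for the real interface.
import requests  # third-party HTTP client (pip install requests)

# All of the following values are hypothetical placeholders
API_BASE = "https://api.example.com"   # replace with the Peakhour API base URL
API_TOKEN = "YOUR_API_TOKEN"           # replace with a real API token
KQL_QUERY = "block.by:waf AND waf.matched_rule.severity:CRITICAL"

def run_log_query(kql, time_range="now-1h"):
    """Execute a KQL filter against a hypothetical log query endpoint."""
    response = requests.get(
        f"{API_BASE}/logs/query",      # endpoint path is an assumption
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"kql": kql, "from": time_range},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    for event in run_log_query(KQL_QUERY).get("events", []):
        print(event.get("client"), event.get("path"), event.get("status"))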
Troubleshooting Query Issues¶
Syntax Errors¶
Problem¶
Query fails with parse errors
Solutions¶
- Check field names match available log fields
- Verify operator syntax (: for equals, > for comparison)
- Ensure proper grouping with parentheses
- Quote string values containing spaces or special characters, as shown below
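For example (the user agent string below is only illustrative):
# Unquoted values containing spaces will fail to parse; quote them
user_agent:"Mozilla/5.0 (Windows NT 10.0; Win64; x64)"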
Performance Issues¶
Problem¶
Queries timeout or run slowly
Solutions¶
- Reduce the time range in the UI for initial testing.
- Use more selective filters early in the query string.
- Avoid leading wildcards where possible.
Missing Data¶
Problem¶
Expected results don't appear
Solutions¶
- Verify time range includes expected data
- Check field names and value formats
- Test with simpler filters first
- Confirm log retention settings
Best Practices¶
Query Development¶
- Start with simple filters and add complexity gradually
- Test queries on smaller time ranges first
- Use the query simulator for syntax validation
- Document complex queries for team use
Performance¶
- Always use the time picker to set the narrowest effective time range.
- Use indexed fields (like client, status, host) for primary filters.
- Be mindful of performance when using wildcards, especially leading wildcards.
Security¶
- Never expose sensitive data in shared queries
- Use appropriate access controls for different query types
- Regularly audit complex automated queries
- Maintain query logs for security analysis
You now have comprehensive advanced log querying capabilities that enable deep analysis of security events, performance data, and traffic patterns using Peakhour's powerful Kibana Query Language (KQL) system.