Next Steps
Your basic DataStream pipeline is successfully processing data! Here's how to expand and customize your deployment to meet growing requirements.
Customize Your Pipeline
Your installed template provides excellent baseline functionality, but you can customize it for your specific needs:
Access Pipeline Configuration
- Navigate to: My Pipelines
- Find your pipeline: Click on your installed Content Hub template
- Edit configuration: Switch to Pipeline Overview tab and click the edit (pencil) icon
Common Customizations
Add Field Mapping:
```yaml
# Map custom fields to standard names
- rename:
    field: src_ip
    target_field: source.ip
- rename:
    field: dst_ip
    target_field: destination.ip
```
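As an illustration (assuming the engine expands dotted target fields into nested objects, as most ECS-style pipelines do; some engines keep flat dotted keys instead), an event would be transformed like this:

```yaml
# Before the rename processors (illustrative values):
src_ip: "10.1.2.3"
dst_ip: "172.16.0.9"

# After:
source:
  ip: "10.1.2.3"
destination:
  ip: "172.16.0.9"
```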
Include Data Filtering:
```yaml
# Drop debugging messages
- drop:
    if: 'log.level == "debug"'

# Keep only security events
- drop:
    if: 'event.category != "security"'
```
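Because drop processors run in sequence, the two rules combine: only non-debug security events survive. An illustrative event that passes both checks:

```yaml
# Kept: log.level is not "debug" and event.category is "security"
log:
  level: info
event:
  category: security

# By contrast, an event with log.level: "debug" is removed by the first rule,
# and one with event.category: "authentication" is removed by the second.
```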
Add GeoIP Enrichment:
```yaml
# Add geographic information for IP addresses
- geoip:
    field: source.ip
    target_field: source.geo
```
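After enrichment, `source.geo` typically carries ECS-style geo fields. The exact field set and values depend on the GeoIP database in use; an illustrative result:

```yaml
source:
  ip: "8.8.8.8"
  geo:                       # added by the geoip processor
    country_iso_code: US
    country_name: United States
    location:                # illustrative coordinates
      lat: 37.751
      lon: -97.822
```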
Custom Field Parsing:
```yaml
# Parse custom application logs
- grok:
    field: message
    patterns:
      # GREEDYDATA (rather than the lazy DATA) ensures file.path captures
      # through to the end of the line
      - "User %{USERNAME:user.name} performed %{WORD:event.action} on %{GREEDYDATA:file.path}"
```
Create Child Pipelines
Break complex processing into modular components:
1. Add Child Pipeline:
   - Click Add new child pipeline in your main pipeline
   - Name it descriptively (e.g., "security-event-enrichment")
2. Delegate Specific Tasks:
   - Route certain log types to specialized processing
   - Handle different data formats with dedicated parsers
   - Separate enrichment logic from core parsing
3. Chain Pipeline Processing:
```yaml
# In main pipeline, call child pipeline
- pipeline:
    name: security-event-enrichment
    if: 'event.category == "security"'
```
Learn More: Pipeline Configuration Guide
Scale Your Data Collection
Add More Devices
Windows Event Logs:
- Go to Fleet Management → Devices → Windows
- Deploy agents to Windows servers for security event collection
- Configure log types: All, Common, Minimal, or Custom
- Monitor authentication, process creation, and system events
Network Security Devices:
- Firewalls: Configure Palo Alto, Cisco ASA, Fortinet devices to send syslogs
- IDS/IPS Systems: Connect Snort, Suricata, or commercial detection systems
- Network Equipment: Collect logs from switches, routers, load balancers
Cloud Services Integration:
- Azure Monitor: Connect Azure Activity Logs and diagnostic data
- AWS CloudTrail: Ingest AWS API and management events
- Microsoft 365: Collect Office 365 audit and security logs
Application Data:
- HTTP Endpoints: Accept webhook data from applications
- Database Logs: Collect audit logs from SQL Server, Oracle, PostgreSQL
- Web Server Logs: Process Apache, IIS, Nginx access and error logs
Add More Targets
Microsoft Sentinel Integration:
- Purpose: Send security data to Microsoft's cloud SIEM
- Configuration: Azure subscription, Sentinel workspace, Data Collection Rules
- Benefits: Advanced analytics, threat detection, incident response
Azure Blob Storage:
- Purpose: Long-term archival and compliance storage
- Features: Automatic lifecycle management, cost-effective storage tiers
- Use Cases: Regulatory retention, historical analysis, data lake storage
Azure Data Explorer:
- Purpose: Real-time analytics and operational dashboards
- Query Language: Kusto Query Language (KQL) for interactive data exploration
- Visualization: Built-in charts, graphs, and dashboard creation
Parallel Processing Through Multiple Routes
To send data to multiple destinations, create separate routes for each target destination rather than configuring multiple targets within a single route.
Creating Multiple Routes for Same Data:
1. Create First Route
   - Route name: "security-to-sentinel"
   - Devices: Select your security log devices
   - Selected pipeline: Choose your security processing pipeline
   - Target: Microsoft Sentinel
2. Create Second Route
   - Route name: "security-to-storage"
   - Devices: Select the same security log devices
   - Selected pipeline: Choose the same security processing pipeline
   - Target: Azure Blob Storage
3. Create Third Route (if needed)
   - Route name: "security-to-analytics"
   - Devices: Select the same security log devices
   - Selected pipeline: Choose the same security processing pipeline
   - Target: Azure Data Explorer
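Conceptually, these routes fan the same processed stream out to three destinations. A pseudo-configuration summary (illustrative only; routes are created in the UI, and the identifiers below are hypothetical):

```yaml
routes:
  - name: security-to-sentinel
    devices: security-log-devices
    pipeline: security-processing
    target: microsoft-sentinel
  - name: security-to-storage
    devices: security-log-devices
    pipeline: security-processing
    target: azure-blob-storage
  - name: security-to-analytics
    devices: security-log-devices
    pipeline: security-processing
    target: azure-data-explorer
```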
Route Strategy Options:
Quick Routes for Simple Scenarios:
- Use Quick Routes when sending the same processed data to multiple destinations
- Duplicate route configuration with different target selections
- Suitable for basic parallel processing needs
Advanced Routes for Complex Scenarios:
- Use Advanced Routes when different destinations need different filtering
- Apply specific conditions for each destination
- Example: Send high-severity events to real-time alerting, all events to storage
Advanced Route with Filtering:
Route 1: Critical events to real-time target
- Route name: "critical-alerts"
- Filter: log.level == "error" || event.severity == "high"
- Target: Real-time alerting system

Route 2: All events to archive storage
- Route name: "complete-archive"
- Filter: none (processes all events)
- Target: Long-term storage
This approach ensures each route handles one specific data flow path, making configuration clearer and troubleshooting simpler.
Learn More:
Implement Advanced Routing
Move beyond basic routes to sophisticated data flow management:
Conditional Routing
Route by Data Content:
1. Navigate to Advanced Routes
   - Go to Routes → Advanced Routes
   - Click Add new route
2. Configure Route with Filtering
   - Route name: "critical-alerts"
   - Description: "High-priority events requiring immediate attention"
   - Filter: log.level == "error" || event.severity == "high"
   - Selected pipeline: Choose your alert enrichment pipeline
   - Devices: Select all relevant devices
   - Targets: Select your real-time alerting target
3. Create Additional Filtered Routes
   - Route name: "normal-processing"
   - Filter: log.level != "error" && event.severity != "high"
   - Selected pipeline: Choose your standard processing pipeline
   - Targets: Select your batch storage target
Route by Source System:
1. Create Network Device Route
   - Filter: host.type == "network"
   - Selected pipeline: Choose network log parser
   - Targets: Network monitoring destinations
2. Create Security Device Route
   - Filter: host.type == "security"
   - Selected pipeline: Choose security log parser
   - Targets: Security analysis platforms
Multi-Target Routes
Parallel Processing:
1. Create Multi-Destination Route
   - Route name: "multi-destination"
   - Devices: Select your syslog collector devices
   - Selected pipeline: Choose universal parser pipeline
   - Multiple Targets: Add multiple targets:
     - Microsoft Sentinel (for security analysis)
     - Azure Blob Storage (for long-term retention)
     - Local File Storage (for local backup)
2. Configure Target Priority
   - Use the Advanced Routes interface to set target processing order
   - Configure error handling if one target fails
   - Set up retry policies for each destination
Conditional Target Selection:
1. Route Setup
   - Route name: "classified-routing"
   - Selected pipeline: Choose data classification pipeline
   - Advanced Configuration: Use the route configuration tab
2. Target Conditions (configured in the Route Configuration tab)
   - Set conditions for sensitive data routing
   - Configure a default target for unclassified data
   - Define fallback targets for processing failures
Load Balancing
Distribute Processing Across Multiple Directors:
DataStream supports load balancing through clustered Director deployments:
1. Deploy Multiple Directors
   - Follow Cluster Deployment guidelines
   - Set up Directors across different servers or regions
   - Configure shared storage and coordination
2. Configure Load Distribution
   - Use DNS round-robin for device connections
   - Implement a load balancer for HTTP/TCP devices
   - Configure Director affinity for specific data sources
3. Implement High Availability
   - Set up Director failover mechanisms
   - Configure automatic scaling based on processing load
   - Monitor Director health and performance
Learn More:
- Advanced Routes Management - Detailed UI operations and configuration
- Route Configuration - Complete filtering syntax and expressions
- Cluster Deployment - Multi-Director load balancing setup
- Deployment Overview - Infrastructure planning and scaling strategies
Manage Your Organization
As your deployment grows, implement proper governance:
User Management
Add Team Members:
- Go to Organization → Users
- Click Add new user
- Configure roles and permissions appropriately
- Send invitation emails to new team members
Role Assignment:
- Owner: Full control, can transfer ownership
- Admin: Full access except ownership transfer
- Contributor: Can edit configurations but cannot delete them
- User: Read-only access to view configurations
Access Control
Permissions Matrix:
- Fleet Management: Control who can modify devices, targets, directors
- Pipeline Management: Restrict who can edit processing logic
- Route Configuration: Control data flow management access
- User Administration: Limit who can add/remove team members
Monitoring and Auditing
Activity Tracking:
- Go to Organization → Audit
- Review user actions and configuration changes
- Monitor login activities and access patterns
- Export audit logs for compliance reporting
Performance Dashboards:
- Track data volumes and processing rates across all directors
- Monitor system health and resource utilization
- Set up alerting for critical processing failures
- Generate capacity planning reports
Learn More: Organization Guide
Integration Scenarios
Security Operations Center (SOC)
Data Sources:
- Firewalls, IDS/IPS, endpoint protection systems
- Domain controllers, authentication servers
- Network monitoring and threat detection platforms
Processing:
- Normalize to ASIM format for consistent analysis
- Enrich with threat intelligence and user context
- Correlate events across multiple security tools
Outputs:
- Microsoft Sentinel for advanced analytics and SOAR
- Azure Blob Storage for forensic analysis and retention
- Real-time alerting for critical security events
IT Operations Management
Data Sources:
- Windows/Linux system logs from servers
- Application logs from business-critical services
- Infrastructure monitoring and performance metrics
Processing:
- Parse error messages and performance indicators
- Extract operational KPIs and SLA metrics
- Filter noise and focus on actionable insights
Outputs:
- Azure Data Explorer for operational dashboards
- File storage for historical trend analysis
- Integration with monitoring tools for alerting
Compliance and Audit
Data Sources:
- Database audit logs and access records
- File access logs and configuration changes
- Privileged account usage and administrative actions
Processing:
- Enrich with user identity and risk context
- Classify data sensitivity and access patterns
- Generate compliance-ready audit trails
Outputs:
- Tamper-proof archival storage for regulatory retention
- Real-time alerting for high-risk activities
- Automated compliance reporting and dashboards
Getting Help and Staying Updated
Documentation Resources
- Component Guides: Use sidebar navigation for detailed configuration options
- Search Function: Find specific topics and troubleshooting information
- Example Configurations: Real-world deployment patterns and templates
Community and Support
- Knowledge Base: Common questions, best practices, and troubleshooting guides
- Community Forum: Connect with other DataStream users and share experiences
- Professional Support: Enterprise support options for critical deployments
Product Updates
- Release Notes: New features, improvements, and bug fixes
- Content Hub Updates: New pipeline templates for additional device types and vendors
- Best Practice Evolution: Updated recommendations based on customer feedback
Congratulations!
You've successfully built your first DataStream pipeline and learned the foundation for sophisticated data processing workflows. Your journey from raw logs to actionable insights is well underway.
What You've Accomplished:
- ✅ Created your DataStream account and cloud presence
- ✅ Deployed and connected a managed Director
- ✅ Configured your first data source device
- ✅ Set up data output destination with proper formatting
- ✅ Installed professional-grade processing templates
- ✅ Connected components with functional data flow routes
- ✅ Verified end-to-end data processing and transformation
- ✅ Learned monitoring and troubleshooting techniques
- ✅ Explored scaling and customization options
Your Next Adventure:
Choose your path forward based on your needs:
- Security Focus: Integrate with Microsoft Sentinel, add threat intelligence enrichment
- Operations Focus: Connect infrastructure monitoring, build operational dashboards
- Scale Focus: Add more data sources, implement advanced routing, deploy clustered Directors
- Compliance Focus: Implement audit trails, long-term retention, automated reporting
The powerful data processing infrastructure you need is now at your fingertips. Welcome to the DataStream community!