Getting started with LivestreamIQ
Last updated
3.1. Log in to the Admin Console
Navigate to the Admin Console login page.
Enter your login credentials:
Login ID: Admin
Password: <Enter Password>
Click on the "Sign in" button
Fields:
Login ID: This field is used to enter the user's unique identifier for accessing the system. It should accept alphanumeric characters.
Password: This field is for entering the user's password. It should mask the input for security.
Sign in Button: Clicking this button submits the login credentials for authentication.
Description:
The Login Screen is the entry point for users to access the LivestreamIQ system. Users must provide their login credentials to proceed.
Usage:
Users should enter their assigned Login ID and Password, then click the "Sign in" button. If the credentials are correct, they are redirected to the dashboard.
Click on the profile icon at the top right corner of the Admin Console.
Select "Logout" from the dropdown menu.
The Dashboard provides a centralized view of key LivestreamIQ information and functionality, such as current cluster status, performance metrics, and system alerts. Version 3.5.0 adds a Logs Dashboard with Data Masking and Enhanced Filtering Options for improved log analysis.
Cluster Status & Performance Metrics: Displays current cluster health, Kafka node metrics, and system alerts.
Logs Dashboard (New in version 3.5.0): A widget that visually represents system logs from the past 24 hours.
Data Masking: Sensitive log data is automatically obfuscated. An “eye” icon allows authorized users to temporarily reveal sensitive information.
Enhanced Filtering Option: Log messages can now be filtered by log levels (ERROR, WARNING, INFO, DEBUG) using a dedicated filter panel with dynamic updates.
Graphical Overview: Time-series graphs and error trend charts present log frequency, peak times, and log level distributions.
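The log level distribution shown in the dashboard's charts can be sketched as a simple proportion calculation. This is an illustrative example only; the entry shape (dicts with a "level" key) is an assumption, not LivestreamIQ's actual data model.

```python
from collections import Counter

def log_level_distribution(entries):
    """Return each log level's share of all entries, e.g. for a pie chart.
    Entries are assumed to be dicts with a "level" key (illustrative shape)."""
    counts = Counter(e["level"] for e in entries)
    total = sum(counts.values())
    return {level: count / total for level, count in counts.items()}

logs = [
    {"level": "INFO"}, {"level": "INFO"},
    {"level": "ERROR"}, {"level": "DEBUG"},
]
print(log_level_distribution(logs))  # {'INFO': 0.5, 'ERROR': 0.25, 'DEBUG': 0.25}
```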
3.4 Add New Cluster
Adding a new cluster lets users expand their network or resources within the system. Follow the steps below to add a new cluster.
1. Click on "Add New Cluster".
2. Enter the Cluster Name.
3. Choose the Environment Type from the dropdown list.
4. To add servers, enter the Hostname and Port, then click the "Add Bootstrap Server" button.
5. Click on "Configure Truststore" to configure SSL/TLS trust store settings.
6. Click on "Configure Authentication" and select an authentication method from the dropdown menu.
7. Click on "Configure Schema Registry" and enter the Schema Registry URL.
8. Click on "Configure Connect" and provide the Kafka Name and Kafka Connect URL.
9. Click on "Configure Metrics", then select metric types from the dropdown and enter the Port.
10. Use the buttons below the form to Reset, Validate, and Submit your changes.
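The form above can be thought of as assembling a cluster definition. The sketch below is purely illustrative: every field name and value is a hypothetical example, not LivestreamIQ's actual submission payload. The helper shows how the bootstrap server entries map to Kafka's usual host:port list.

```python
# Hypothetical payload assembled from the "Add New Cluster" form.
# All field names and values are illustrative assumptions.
new_cluster = {
    "cluster_name": "payments-dev",
    "environment": "Development",
    "bootstrap_servers": [
        {"hostname": "kafka-1.internal", "port": 9092},
        {"hostname": "kafka-2.internal", "port": 9092},
    ],
    "connect": {"kafka_name": "payments", "connect_url": "http://connect.internal:8083"},
    "metrics": {"types": ["JMX"], "port": 9999},
}

def bootstrap_server_string(cluster):
    """Render bootstrap servers in Kafka's conventional host:port,host:port form."""
    return ",".join(f"{s['hostname']}:{s['port']}" for s in cluster["bootstrap_servers"])

print(bootstrap_server_string(new_cluster))
# kafka-1.internal:9092,kafka-2.internal:9092
```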
Fields:
Cluster Name: This field is for naming the new Kafka cluster. It should accept a unique, descriptive name.
Environment Type: A dropdown to select the type of environment (e.g., Development, Production).
Hostname: This field requires the hostname of the Kafka cluster.
Port: This field requires the port number on which the Kafka cluster is running.
Add Bootstrap Server: Button to add a new bootstrap server to the list.
Configure Truststore: Button to configure SSL/TLS trust store settings.
Configure Authentication: Button to configure authentication settings.
Configure Schema Registry: Button to configure schema registry settings.
Configure Connect: Button to configure Kafka Connect settings.
Configure KSQL DB: Button to configure KSQL DB settings.
Configure Metrics: Button to configure metrics collection.
Reset Button: Resets all fields in the form.
Validate Button: Validates the entered details before submission.
Submit Button: Submits the form and adds the new cluster.
Description:
The "Add New Cluster" screen allows users to set up a new Kafka cluster by providing necessary details such as name, environment, and configuration settings.
Usage:
Users should fill in all required fields, configure necessary settings, validate the details, and then submit the form to add the new Kafka cluster to the system.
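The validation step might check rules like the ones sketched below. This is a minimal, assumed set of checks for illustration; the field names and the actual rules LivestreamIQ applies are not documented here.

```python
def validate_cluster_form(form):
    """Return a list of validation errors for an Add New Cluster form.
    Field names and rules are illustrative assumptions."""
    errors = []
    if not form.get("cluster_name", "").strip():
        errors.append("Cluster Name is required")
    if form.get("environment") not in {"Development", "Production"}:
        errors.append("Environment Type must be selected from the dropdown")
    if not form.get("bootstrap_servers"):
        errors.append("At least one bootstrap server is required")
    for server in form.get("bootstrap_servers", []):
        if not (1 <= int(server.get("port", 0)) <= 65535):
            errors.append(f"Invalid port for {server.get('hostname', '?')}")
    return errors

form = {
    "cluster_name": "payments-dev",
    "environment": "Development",
    "bootstrap_servers": [{"hostname": "kafka-1.internal", "port": 9092}],
}
print(validate_cluster_form(form))  # []
```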
Online Cluster: Indicates clusters currently active and operational.
Offline Cluster: Indicates clusters that are currently not active.
1. Click on the cluster name (highlighted in blue) to access its details.
2. View cluster details, including partitions, broker count, active controller, version, URP (Under Replicated Partitions), ISR (In Sync Replicas), and OOS (Out of Sync Replicas).
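The URP, ISR, and OOS figures above can be derived from partition replica metadata. The sketch below assumes each partition is a dict with 'replicas' (assigned broker ids) and 'isr' (in-sync broker ids); this shape is an illustration, not LivestreamIQ's actual data model.

```python
def partition_health(partitions):
    """Summarize URP / ISR / OOS counts from partition metadata.

    A partition is under-replicated when its in-sync replica set is
    smaller than its assigned replica set."""
    urp = sum(1 for p in partitions if len(p["isr"]) < len(p["replicas"]))
    isr = sum(len(p["isr"]) for p in partitions)
    oos = sum(len(p["replicas"]) - len(p["isr"]) for p in partitions)
    return {"URP": urp, "ISR": isr, "OOS": oos}

parts = [
    {"replicas": [1, 2, 3], "isr": [1, 2, 3]},  # fully in sync
    {"replicas": [1, 2, 3], "isr": [1, 2]},     # under-replicated
]
print(partition_health(parts))  # {'URP': 1, 'ISR': 5, 'OOS': 1}
```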
Fields:
Broker List: Lists all brokers in the Kafka cluster.
Broker Health Status: Displays the health status of each broker, such as active, inactive, or error states.
Description:
The Brokers Dashboard provides a list of all brokers and their health statuses, allowing users to monitor the operational status of each broker in the Kafka cluster.
Usage:
Users can quickly identify any broker that might be facing issues and take necessary actions to ensure the cluster's health and performance.
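Identifying brokers that need attention amounts to filtering the broker list by health status. The broker dict shape below is an assumption for illustration.

```python
def flag_unhealthy_brokers(brokers):
    """Return ids of brokers whose status is not 'active'
    (e.g. 'inactive' or 'error'). Dict shape is illustrative."""
    return [b["id"] for b in brokers if b["status"] != "active"]

brokers = [
    {"id": 101, "status": "active"},
    {"id": 102, "status": "error"},
    {"id": 103, "status": "inactive"},
]
print(flag_unhealthy_brokers(brokers))  # [102, 103]
```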
For segment details, click on "Disk Utilization".
For configurations related to Broker, click on "Configs".
For the metrics of the broker, click on "Metrics".
To view topic details, click the "Topic" tab.
Toggle between Internal and External Topics.
Click on a topic to see live data, messages, consumers, settings, and statistics.
Fields:
Topic Selection: Dropdown to select the Kafka topic for which messages need to be viewed.
Message List: Displays the list of messages for the selected topic.
Message Content: Shows the content and details of the selected message.
Description:
The Message Browser allows users to view and inspect the content of messages within a Kafka topic, helping in debugging and monitoring message flows.
Usage:
Users select a topic to view its messages. They can click on individual messages to see their content and details, ensuring messages are aligned according to the new framework.
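Inspecting messages boils down to decoding record values and filtering on fields of interest. The sketch below assumes messages arrive as (key, JSON-encoded value bytes) pairs, which is a stand-in for records fetched from a topic rather than LivestreamIQ's actual API.

```python
import json

def inspect_messages(raw_messages, key_filter=None):
    """Decode JSON-encoded message values, optionally keeping only
    records with a matching key. Input shape is an assumption."""
    decoded = []
    for key, value in raw_messages:
        if key_filter is not None and key != key_filter:
            continue
        decoded.append({"key": key, "value": json.loads(value)})
    return decoded

raw = [
    ("order-1", b'{"amount": 42}'),
    ("order-2", b'{"amount": 7}'),
]
print(inspect_messages(raw, key_filter="order-2"))
# [{'key': 'order-2', 'value': {'amount': 7}}]
```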
Use the "Produce Message" button to send a message from LivestreamIQ.
Click the "Consumer" tab to view the consumer list and access consumer information.
Fields:
Consumer Group List: Displays all the consumer groups in the Kafka cluster.
Consumer Group Details: Shows detailed information about the selected consumer group, including members and their statuses.
Description:
The Consumer Groups Dashboard provides an overview of all consumer groups within the Kafka cluster, allowing users to monitor and manage group memberships and statuses.
Usage:
Users can click on a consumer group to view its details. This information helps in managing and troubleshooting consumer group-related issues.
To access consumer information, select a consumer from the list.
Click "Transaction_init" to examine transaction details for the selected consumer.
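A common troubleshooting check for a consumer group is its lag: the gap between the latest offset in each partition and the group's committed offset. The offset-map shape below is an illustrative assumption.

```python
def consumer_lag(end_offsets, committed_offsets):
    """Compute per-partition lag for a consumer group.

    Both arguments map partition id -> offset; a partition with no
    committed offset is treated as lagging from offset 0."""
    return {p: end_offsets[p] - committed_offsets.get(p, 0)
            for p in end_offsets}

lag = consumer_lag({0: 120, 1: 300}, {0: 100, 1: 300})
print(lag)  # {0: 20, 1: 0}
```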
3.7 New Logs Dashboard (Version 3.5.0)
The new Logs Dashboard is designed to give you an interactive, graphical representation of system logs over the last 24 hours. It incorporates enhanced filtering options and data masking functionality for secure, real-time log analysis.
Data Masking:
Description: Sensitive log data is masked by default. Authorized users can use an eye icon to temporarily reveal the sensitive information.
Visual Cues: A masked field with an “eye” icon; when clicked, a visual change indicates that the data has been unmasked.
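Conceptually, masking replaces sensitive substrings unless the viewer is authorized to reveal them. The sketch below masks only email addresses with a simplified pattern; which field types the product actually masks, and how, is an assumption here.

```python
import re

def mask_sensitive(line, reveal=False):
    """Mask email addresses in a log line; reveal=True mimics the
    eye icon for authorized users. The pattern is simplified."""
    if reveal:
        return line
    return re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "****@****", line)

line = "user alice@example.com failed login"
print(mask_sensitive(line))               # user ****@**** failed login
print(mask_sensitive(line, reveal=True))  # user alice@example.com failed login
```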
Enhanced Filtering Option:
Description: Apply filters based on log levels (ERROR, WARNING, INFO, DEBUG) to quickly isolate relevant log entries.
Visual Interface: A filter panel with dropdown options; updates are reflected in real time as you choose your criteria.
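The level filter behaves like a simple set-membership test over log entries, as this sketch shows (entry shape assumed for illustration):

```python
def filter_logs(entries, levels):
    """Keep only entries whose level is in the selected set,
    as chosen in the filter panel dropdown (case-insensitive)."""
    selected = {lvl.upper() for lvl in levels}
    return [e for e in entries if e["level"] in selected]

logs = [
    {"level": "ERROR", "msg": "broker 2 unreachable"},
    {"level": "INFO", "msg": "rebalance complete"},
    {"level": "DEBUG", "msg": "heartbeat ok"},
]
print(filter_logs(logs, ["error", "info"]))
# keeps the ERROR and INFO entries only
```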
Graphical Overview:
Time-Series Graphs: Illustrate log frequency trends over the previous 24 hours.
Log Level Distribution: Pie chart displaying the proportion of logs across various log levels.
Error Trends Chart: Visualizes error log trends to aid in quick identification of recurring issues.
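The time-series graphs group log events into time buckets. A minimal sketch, assuming ISO-8601 timestamps and hourly buckets (both assumptions for illustration):

```python
from collections import Counter
from datetime import datetime

def hourly_log_counts(timestamps):
    """Bucket ISO-8601 log timestamps into per-hour counts for a
    24-hour time-series graph. Input format is an assumption."""
    buckets = Counter(
        datetime.fromisoformat(ts).replace(minute=0, second=0, microsecond=0)
        for ts in timestamps
    )
    return {hour.isoformat(): count for hour, count in sorted(buckets.items())}

stamps = ["2025-01-01T10:05:00", "2025-01-01T10:59:00", "2025-01-01T11:01:00"]
print(hourly_log_counts(stamps))
# {'2025-01-01T10:00:00': 2, '2025-01-01T11:00:00': 1}
```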
Navigate to the Dashboard Page where the Logs Dashboard widget is located.
Apply filters using the dropdown menus to select a specific log level.
Observe the logs updating in real time based on your selected filters.
Use the eye icon to unmask sensitive information if you have the required permissions.
Interact with the charts for a more detailed breakdown of log data.
Usage: The Logs Dashboard helps you quickly identify issues, monitor system performance, and protect sensitive log details while still providing full visibility when needed.