Compare commits

...

4 Commits

Author SHA1 Message Date
aks07
724515117d feat: add trace details v3 flamegraph
Canvas-based flamegraph with:
- Zoom/pan via drag with scroll support
- Span hover cards with event tooltips
- Click to select span (drag detection)
- Web worker for visual layout computation
- TimelineV3 ruler component
- V3 color palette for service differentiation
- Flamegraph limit parameter (120k spans)
2026-04-10 09:56:38 +05:30
Nikhil Soni
e543776efc chore: send obfuscate query in the clickhouse query panel update (#10848)
* chore: send query in the clickhouse query panel update

* chore: obfuscate query to avoid sending sensitive values
2026-04-09 14:15:10 +00:00
Pandey
621127b7fb feat(audit): wire auditor into DI graph and service lifecycle (#10891)
* feat(audit): wire auditor into DI graph and service lifecycle

Register the auditor in the factory service registry so it participates
in the application lifecycle (start/stop/health). Community uses noopauditor;
enterprise uses otlphttpauditor behind a licensing gate. Pass the auditor
instance to the audit middleware instead of nil.

* feat(audit): use NamedMap provider pattern with config-driven selection

Switch from single-factory callback to NamedMap + factory.NewProviderFromNamedMap
so the config's Provider field selects the auditor implementation. Add
NewAuditorProviderFactories() with noop as the community default. Enterprise
extends the map with otlphttpauditor. Add auditor section to conf/example.yaml
and set default provider to "noop" in config.

* chore: move auditor config to end of example.yaml
2026-04-09 11:44:05 +00:00
Pandey
0648cd4e18 feat(audit): add telemetry audit query infrastructure (#10811)
* feat(audit): add telemetry audit query infrastructure

Add pkg/telemetryaudit/ with tables, field mapper, condition builder,
and statement builder for querying audit logs from signoz_audit database.
Add SourceAudit to source enum and integrate audit key resolution
into the metadata store.

* chore: address review comments

Comment out SourceAudit from Enum() until frontend is ready.
Use actual audit table constants in metadata test helpers.

* fix(audit): align field mapper with actual audit DDL schema

Remove resources_string (not in audit table DDL).
Add event_name as intrinsic column.
Resource context resolves only through the resource JSON column.

* feat(audit): add audit field value autocomplete support

Wire distributed_tag_attributes_v2 for signoz_audit into the
metadata store. Add getAuditFieldValues() and route SignalLogs +
SourceAudit to it in GetFieldValues().

* test(audit): add statement builder tests

Cover all three request types (list, time series, scalar) with
audit-specific query patterns: materialized column filters, AND/OR
conditions, limit CTEs, and group-by expressions.

* refactor(audit): inline field key map into test file

Remove test_data.go and inline the audit field key map directly
into statement_builder_test.go with a compact helper function.

* style(audit): move column map to const.go, use sqlbuilder.As in metadata

Move logsV2Columns from field_mapper.go to const.go to colocate all
column definitions. Switch getAuditKeys() to use sb.As() instead of
raw string formatting. Fix FieldContext alignment.

* fix(audit): align table names with schema migration

Migration uses logs/distributed_logs (not logs_v2/distributed_logs_v2).
Rename LogsV2TableName to LogsTableName and LogsV2LocalTableName to
LogsLocalTableName to match the actual signoz_audit DDL.

* feat(audit): add integration test fixture for audit logs

AuditLog fixture inserts into all 5 signoz_audit tables matching
the schema migration DDL: distributed_logs (no resources_string,
has event_name), distributed_logs_resource, distributed_tag_attributes_v2,
distributed_logs_attribute_keys, distributed_logs_resource_keys.

* fix(audit): rename tag_attributes_v2 to tag_attributes

Migration uses tag_attributes/distributed_tag_attributes (no _v2
suffix). Rename constants and update all references including the
integration test fixture.

* feat(audit): wire audit statement builder into querier

Add auditStmtBuilder to querier struct and route LogAggregation
queries with source=audit to it in all three dispatch locations
(main query, live tail, shiftedQuery). Create and wire the full
audit query stack in signozquerier provider.

* test(audit): add integration tests for audit log querying

Cover the documented query patterns: list all events, filter by
principal ID, filter by outcome, filter by resource name+ID,
filter by principal type, scalar count for alerting, and
isolation test ensuring audit data doesn't leak into regular logs.

* fix(audit): revert sb.As in getAuditKeys, fix fixture column_names

Revert getAuditKeys to use raw SQL strings instead of sb.As() which
incorrectly treated string literals as column references. Add explicit
column_names to all ClickHouse insert calls in the audit fixture.

* fix(audit): remove debug assertion from integration test

* feat(audit): internalize resource filter in audit statement builder

Build the resource filter internally pointing at
signoz_audit.distributed_logs_resource. Add LogsResourceTableName
constant. Remove resourceFilterStmtBuilder from constructor params.
Update test expectations to use the audit resource table.

* fix(audit): rename resource.name to resource.kind, move to resource attributes

Align with schema change from SigNoz/signoz#10826:
- signoz.audit.resource.name renamed to signoz.audit.resource.kind
- resource.kind and resource.id moved from event attributes to OTel
  Resource attributes (resource JSON column)
- Materialized columns reduced from 7 to 5 (resource.kind and
  resource.id no longer materialized)

* refactor(audit): use pytest.mark.parametrize for filter integration tests

Consolidate filter test functions into a single parametrized test.
6/8 tests passing; resource kind+ID filter and scalar count need
further investigation (resource filter JSON key extraction with
dotted keys, scalar response format).

* fix(audit): add source to resource filter for correct metadata routing

Add source param to telemetryresourcefilter.New so the resource
filter's key selectors include Source when calling GetKeysMulti.
Without this, audit resource keys route to signoz_logs metadata
tables instead of signoz_audit. Fix scalar test to use table
response format (columns+data, not rows).

* refactor(audit): reuse querier fixtures in integration tests

Add source param to BuilderQuery and build_scalar_query in the
querier fixture. Replace custom _build_audit_query and
_build_audit_ts_query helpers with BuilderQuery and
build_scalar_query from the shared fixtures.

* refactor(audit): remove wrapper helpers, inline make_query_request calls

Remove _query_audit_raw and _query_audit_scalar helpers. Use
make_query_request, BuilderQuery, and build_scalar_query directly.
Compute time window at test execution time via _time_window() to
avoid stale module-level timestamps.

* refactor(audit): inline _time_window into test functions

* style(audit): use snake_case for pytest parametrize IDs

* refactor(audit): inline DEFAULT_ORDER using build_order_by

Use build_order_by from querier fixtures instead of OrderBy/
TelemetryFieldKey dataclasses. Allow BuilderQuery.order to accept
plain dicts alongside OrderBy objects.

* refactor(audit): inline all data setup, use distinct scenarios per test

Remove _insert_standard_audit_events helper. Each test now owns its
data: list_all uses alert-rule/saved-view/user resource types,
scalar_count uses multiple failures from different principals (count=2),
leak test uses a single organization event. Parametrized filter tests
keep the original 5-event dataset.

* fix(audit): remove silent empty-string guards in metadata store

Remove guards that silently returned nil/empty when audit DB params
were empty. All call sites now pass real constants, so misconfiguration
should fail loudly rather than produce silent empty results.

* style(audit): remove module docstring from integration test

* style: formatting fix in tables file

* style: formatting fix in tables file

* fix: add auditStmtBuilder nil param to querier_test.go

* fix: fix fmt
2026-04-09 08:12:32 +00:00
67 changed files with 7835 additions and 16 deletions

View File

@@ -8,6 +8,7 @@ import (
"github.com/SigNoz/signoz/cmd"
"github.com/SigNoz/signoz/pkg/analytics"
"github.com/SigNoz/signoz/pkg/auditor"
"github.com/SigNoz/signoz/pkg/authn"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/authz/openfgaauthz"
@@ -93,6 +94,9 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) error {
func(_ licensing.Licensing) factory.ProviderFactory[gateway.Gateway, gateway.Config] {
return noopgateway.NewProviderFactory()
},
func(_ licensing.Licensing) factory.NamedMap[factory.ProviderFactory[auditor.Auditor, auditor.Config]] {
return signoz.NewAuditorProviderFactories()
},
func(ps factory.ProviderSettings, q querier.Querier, a analytics.Analytics) querier.Handler {
return querier.NewHandler(ps, q, a)
},

View File

@@ -8,6 +8,7 @@ import (
"github.com/spf13/cobra"
"github.com/SigNoz/signoz/cmd"
"github.com/SigNoz/signoz/ee/auditor/otlphttpauditor"
"github.com/SigNoz/signoz/ee/authn/callbackauthn/oidccallbackauthn"
"github.com/SigNoz/signoz/ee/authn/callbackauthn/samlcallbackauthn"
"github.com/SigNoz/signoz/ee/authz/openfgaauthz"
@@ -24,6 +25,7 @@ import (
enterprisezeus "github.com/SigNoz/signoz/ee/zeus"
"github.com/SigNoz/signoz/ee/zeus/httpzeus"
"github.com/SigNoz/signoz/pkg/analytics"
"github.com/SigNoz/signoz/pkg/auditor"
"github.com/SigNoz/signoz/pkg/authn"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/errors"
@@ -133,6 +135,13 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) error {
func(licensing licensing.Licensing) factory.ProviderFactory[gateway.Gateway, gateway.Config] {
return httpgateway.NewProviderFactory(licensing)
},
func(licensing licensing.Licensing) factory.NamedMap[factory.ProviderFactory[auditor.Auditor, auditor.Config]] {
factories := signoz.NewAuditorProviderFactories()
if err := factories.Add(otlphttpauditor.NewFactory(licensing, version.Info)); err != nil {
panic(err)
}
return factories
},
func(ps factory.ProviderSettings, q querier.Querier, a analytics.Analytics) querier.Handler {
communityHandler := querier.NewHandler(ps, q, a)
return eequerier.NewHandler(ps, q, communityHandler)

View File

@@ -364,3 +364,34 @@ serviceaccount:
analytics:
# toggle service account analytics
enabled: true
##################### Auditor #####################
auditor:
# Specifies the auditor provider to use.
# noop: discards all audit events (community default).
# otlphttp: exports audit events via OTLP HTTP (enterprise).
provider: noop
# The async channel capacity for audit events. Events are dropped when full (fail-open).
buffer_size: 1000
# The maximum number of events per export batch.
batch_size: 100
# The maximum time between export flushes.
flush_interval: 1s
otlphttp:
# The target scheme://host:port/path of the OTLP HTTP endpoint.
endpoint: http://localhost:4318/v1/logs
# Whether to use HTTP instead of HTTPS.
insecure: false
# The maximum duration for an export attempt.
timeout: 10s
# Additional HTTP headers sent with every export request.
headers: {}
retry:
# Whether to retry on transient failures.
enabled: true
# The initial wait time before the first retry.
initial_interval: 5s
# The upper bound on backoff interval.
max_interval: 30s
# The total maximum time spent retrying.
max_elapsed_time: 60s

View File

@@ -227,7 +227,7 @@ func (s *Server) createPublicServer(apiHandler *api.APIHandler, web web.Web) (*h
s.config.APIServer.Timeout.Default,
s.config.APIServer.Timeout.Max,
).Wrap)
r.Use(middleware.NewAudit(s.signoz.Instrumentation.Logger(), s.config.APIServer.Logging.ExcludedRoutes, nil).Wrap)
r.Use(middleware.NewAudit(s.signoz.Instrumentation.Logger(), s.config.APIServer.Logging.ExcludedRoutes, s.signoz.Auditor).Wrap)
r.Use(middleware.NewComment().Wrap)
apiHandler.RegisterRoutes(r, am)

View File

@@ -0,0 +1,4 @@
.timeline-v3-container {
// flex: 1;
overflow: visible;
}

View File

@@ -0,0 +1,87 @@
import { useEffect, useState } from 'react';
import { useMeasure } from 'react-use';
import { useIsDarkMode } from 'hooks/useDarkMode';
import {
getIntervals,
getMinimumIntervalsBasedOnWidth,
Interval,
} from './utils';
import './TimelineV3.styles.scss';
interface ITimelineV3Props {
startTimestamp: number;
endTimestamp: number;
timelineHeight: number;
offsetTimestamp: number;
}
function TimelineV3(props: ITimelineV3Props): JSX.Element {
const {
startTimestamp,
endTimestamp,
timelineHeight,
offsetTimestamp,
} = props;
const [intervals, setIntervals] = useState<Interval[]>([]);
const [ref, { width }] = useMeasure<HTMLDivElement>();
const isDarkMode = useIsDarkMode();
useEffect(() => {
const spread = endTimestamp - startTimestamp;
if (spread < 0) {
return;
}
const minIntervals = getMinimumIntervalsBasedOnWidth(width);
const intervalisedSpread = spread / minIntervals;
const nextIntervals = getIntervals(intervalisedSpread, spread, offsetTimestamp);
setIntervals(nextIntervals);
}, [startTimestamp, endTimestamp, width, offsetTimestamp]);
if (endTimestamp < startTimestamp) {
console.error(
'endTimestamp cannot be less than startTimestamp',
startTimestamp,
endTimestamp,
);
return <div />;
}
const strokeColor = isDarkMode ? 'rgba(192, 193, 195, 0.8)' : 'black';
return (
<div ref={ref as never} className="timeline-v3-container">
<svg
width={width}
height={timelineHeight * 2.5}
xmlns="http://www.w3.org/2000/svg"
overflow="visible"
>
{intervals &&
intervals.length > 0 &&
intervals.map((interval, index) => (
<g
transform={`translate(${(interval.percentage * width) / 100},0)`}
key={`${interval.percentage + interval.label + index}`}
textAnchor="middle"
fontSize="0.6rem"
>
<text
x={index === intervals.length - 1 ? -10 : 0}
y={timelineHeight * 2}
fill={strokeColor}
>
{interval.label}
</text>
<line y1={0} y2={timelineHeight} stroke={strokeColor} strokeWidth="1" />
</g>
))}
</svg>
</div>
);
}
export default TimelineV3;

View File

@@ -0,0 +1,93 @@
import {
IIntervalUnit,
Interval,
INTERVAL_UNITS,
resolveTimeFromInterval,
} from 'components/TimelineV2/utils';
import { toFixed } from 'utils/toFixed';
export type { Interval };
/** Fewer intervals than TimelineV2 for a cleaner flamegraph ruler. */
export function getMinimumIntervalsBasedOnWidth(width: number): number {
if (width < 640) {
return 3;
}
if (width < 768) {
return 4;
}
if (width < 1024) {
return 5;
}
return 6;
}
/**
* Computes timeline intervals with offset-aware labels.
* Labels reflect absolute time from trace start (offsetTimestamp + elapsed),
* so when zoomed into a window, the first tick shows e.g. "50ms" not "0ms".
*/
export function getIntervals(
intervalSpread: number,
baseSpread: number,
offsetTimestamp: number,
): Interval[] {
const integerPartString = intervalSpread.toString().split('.')[0];
const integerPartLength = integerPartString.length;
const intervalSpreadNormalized =
intervalSpread < 1.0
? intervalSpread
: Math.floor(Number(integerPartString) / 10 ** (integerPartLength - 1)) *
10 ** (integerPartLength - 1);
// Unit must suit both: (1) tick granularity (intervalSpread) and (2) label magnitude
// (offsetTimestamp). When zoomed deep into a trace, labels show offsetTimestamp + elapsed,
// so we must pick a unit where that value is readable (e.g. "500.00s" not "500000.00ms").
const valueForUnitSelection = Math.max(offsetTimestamp, intervalSpread);
let intervalUnit: IIntervalUnit = INTERVAL_UNITS[0];
for (let idx = INTERVAL_UNITS.length - 1; idx >= 0; idx -= 1) {
const standardInterval = INTERVAL_UNITS[idx];
if (valueForUnitSelection * standardInterval.multiplier >= 1) {
intervalUnit = INTERVAL_UNITS[idx];
break;
}
}
const intervals: Interval[] = [
{
label: `${toFixed(
resolveTimeFromInterval(offsetTimestamp, intervalUnit),
2,
)}${intervalUnit.name}`,
percentage: 0,
},
];
let tempBaseSpread = baseSpread;
let elapsedIntervals = 0;
while (tempBaseSpread && intervals.length < 20) {
let intervalTime: number;
if (tempBaseSpread <= 1.5 * intervalSpreadNormalized) {
intervalTime = elapsedIntervals + tempBaseSpread;
tempBaseSpread = 0;
} else {
intervalTime = elapsedIntervals + intervalSpreadNormalized;
tempBaseSpread -= intervalSpreadNormalized;
}
elapsedIntervals = intervalTime;
const labelTime = offsetTimestamp + intervalTime;
intervals.push({
label: `${toFixed(resolveTimeFromInterval(labelTime, intervalUnit), 2)}${
intervalUnit.name
}`,
percentage: (intervalTime / baseSpread) * 100,
});
}
return intervals;
}
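
A worked example of the offset-aware labelling, assuming INTERVAL_UNITS contains a millisecond unit with multiplier 1 so the inputs below read as milliseconds (exact label formatting depends on toFixed):

// A 500ms window starting 50ms into the trace, rendered with ~5 ticks
// (intervalSpread = 500 / 5 = 100):
const ticks = getIntervals(100, 500, 50);
// ticks[0] → { label: '50.00ms', percentage: 0 }
// then 150.00ms @ 20%, 250.00ms @ 40%, 350.00ms @ 60%, 450.00ms @ 80%,
// and a final 550.00ms tick at 100% (a trailing remainder ≤ 1.5× the
// step is merged into the last tick).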

View File

@@ -33,6 +33,125 @@ const themeColors = {
purple: '#800080',
cyan: '#00FFFF',
},
traceDetailColorsV3: {
// Blues
dodgerBlue: '#2F80ED',
royalBlue: '#3366E6',
steelBlue: '#4682B4',
// Teals / Cyans
turquoise: '#00CEC9',
lagoon: '#1ABC9C',
cyanBright: '#22A6F2',
// Greens
emeraldGreen: '#27AE60',
mediumSeaGreen: '#3CB371',
limeGreen: '#A3E635',
// Yellows / Golds
festivalYellow: '#F2C94C',
sunflower: '#FFD93D',
warmAmber: '#FFCA28',
// Purples / Violets
mediumPurple: '#BB6BD9',
royalPurple: '#9B51E0',
orchid: '#DA77F2',
// Accent
neonViolet: '#C77DFF',
electricPurple: '#6C5CE7',
arcticBlue: '#48DBFB',
// Blues extended
blue1: '#1F63E0',
blue2: '#3A7AED',
blue3: '#5A9DF5',
blue4: '#2874A6',
blue5: '#2E86C1',
blue6: '#3498DB',
// Cyans
cyan1: '#00B0AA',
cyan2: '#33D6C2',
cyan3: '#66E9DA',
// Greens extended
green1: '#1E8449',
green2: '#2ECC71',
green3: '#58D68D',
green4: '#229954',
green5: '#27AE60',
green6: '#52BE80',
// Forest
forest1: '#27AE60',
forest2: '#2ECC71',
forest3: '#58D68D',
// Lime
lime1: '#A3E635',
lime2: '#B9F18D',
lime3: '#D4FFB0',
// Teals
teal1: '#009688',
teal2: '#1ABC9C',
teal3: '#48C9B0',
teal4: '#1ABC9C',
teal5: '#48C9B0',
teal6: '#76D7C4',
// Yellows
yellow1: '#F1C40F',
yellow2: '#F7DC6F',
yellow3: '#F9E79F',
// Gold
gold1: '#F39C12',
gold2: '#F1C40F',
gold3: '#F7DC6F',
gold4: '#B7950B',
gold5: '#F1C40F',
gold6: '#F4D03F',
// Mustard
mustard1: '#F1C40F',
mustard2: '#F7DC6F',
mustard3: '#F9E79F',
// Aqua
aqua1: '#00BFFF',
aqua2: '#1E90FF',
aqua3: '#63B8FF',
// Purple extended
purple1: '#8E44AD',
purple2: '#9B59B6',
purple3: '#BB8FCE',
violet1: '#8E44AD',
violet2: '#9B59B6',
violet3: '#BB8FCE',
violet4: '#7D3C98',
violet5: '#8E44AD',
violet6: '#9B59B6',
// Lavender
lavender1: '#9B59B6',
lavender2: '#AF7AC5',
lavender3: '#C39BD3',
// Oranges (safe ones, not red-ish)
orange4: '#D35400',
orange5: '#E67E22',
orange6: '#EB984E',
coral1: '#E67E22',
coral2: '#F39C12',
coral3: '#F5B041',
},
chartcolors: {
// Blues (3)
dodgerBlue: '#2F80ED',

View File

@@ -677,6 +677,18 @@ function NewWidget({
queryType: currentQuery.queryType,
isNewPanel,
dataSource: currentQuery?.builder?.queryData?.[0]?.dataSource,
...(currentQuery.queryType === EQueryType.CLICKHOUSE && {
clickhouseQueryCount: currentQuery.clickhouse_sql.length,
clickhouseQueries: currentQuery.clickhouse_sql.map((q) => ({
name: q.name,
query: (q.query ?? '')
.replace(/--[^\n]*/g, '') // strip line comments
.replace(/\/\*[\s\S]*?\*\//g, '') // strip block comments
.replace(/'(?:[^'\\]|\\.|'')*'/g, "'?'") // replace single-quoted strings (handles \' and '' escapes)
.replace(/\b\d+(?:\.\d+)?(?:[eE][+-]?\d+)?\b/g, '?'), // replace numeric literals (int, float, scientific)
disabled: q.disabled,
})),
}),
});
setSaveModal(true);
// eslint-disable-next-line react-hooks/exhaustive-deps
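
A worked example of what the obfuscation chain above produces (hypothetical input, not from the diff):

// Hypothetical query run through the same replace chain as above:
const raw =
  "SELECT count() FROM logs WHERE user = 'alice' AND ts > 1699999999 -- debug";
// Line comment stripped, string literal → '?', numeric literal → ?:
// "SELECT count() FROM logs WHERE user = '?' AND ts > ?"
// (modulo leftover whitespace), so the reported event carries only the
// query shape, never the literal values.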

View File

@@ -0,0 +1,60 @@
.event-tooltip-content {
font-family: Inter, sans-serif;
font-size: 12px;
color: #fff;
max-width: 300px;
&__header {
display: inline-flex;
align-items: center;
gap: 4px;
background: rgba(255, 255, 255, 0.1);
border-radius: 3px;
padding: 2px 6px;
font-size: 10px;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.5px;
color: rgba(255, 255, 255, 0.7);
margin-bottom: 6px;
}
&__name {
font-weight: 600;
margin-bottom: 2px;
color: rgb(14, 165, 233);
&.error {
color: rgb(239, 68, 68);
}
}
&__time {
font-size: 11px;
opacity: 0.8;
margin-bottom: 4px;
}
&__divider {
border-top: 1px solid rgba(255, 255, 255, 0.1);
margin: 6px 0;
}
&__attributes {
font-size: 11px;
}
&__kv {
margin-bottom: 2px;
line-height: 1.4;
word-break: break-all;
}
&__key {
opacity: 0.6;
}
&__value {
opacity: 0.9;
}
}

View File

@@ -0,0 +1,49 @@
import { convertTimeToRelevantUnit } from 'container/TraceDetail/utils';
import { Diamond } from 'lucide-react';
import { toFixed } from 'utils/toFixed';
import './EventTooltipContent.styles.scss';
export interface EventTooltipContentProps {
eventName: string;
timeOffsetMs: number;
isError: boolean;
attributeMap: Record<string, string>;
}
export function EventTooltipContent({
eventName,
timeOffsetMs,
isError,
attributeMap,
}: EventTooltipContentProps): JSX.Element {
const { time, timeUnitName } = convertTimeToRelevantUnit(timeOffsetMs);
return (
<div className="event-tooltip-content">
<div className="event-tooltip-content__header">
<Diamond size={10} />
<span>EVENT DETAILS</span>
</div>
<div className={`event-tooltip-content__name ${isError ? 'error' : ''}`}>
{eventName}
</div>
<div className="event-tooltip-content__time">
{toFixed(time, 2)} {timeUnitName} from start
</div>
{Object.keys(attributeMap).length > 0 && (
<>
<div className="event-tooltip-content__divider" />
<div className="event-tooltip-content__attributes">
{Object.entries(attributeMap).map(([key, value]) => (
<div key={key} className="event-tooltip-content__kv">
<span className="event-tooltip-content__key">{key}:</span>{' '}
<span className="event-tooltip-content__value">{value}</span>
</div>
))}
</div>
</>
)}
</div>
);
}

View File

@@ -0,0 +1,28 @@
.span-hover-card-popover {
.ant-popover-inner {
background-color: rgba(30, 30, 30, 0.95);
padding: 8px 12px;
border-radius: 4px;
box-shadow: 0 2px 8px rgba(0, 0, 0, 0.3);
border: none;
}
.ant-popover-inner-content {
padding: 0;
}
}
.span-hover-card-content {
font-family: Inter, sans-serif;
font-size: 12px;
color: #fff;
&__name {
font-weight: 600;
margin-bottom: 4px;
}
&__row {
line-height: 1.5;
}
}

View File

@@ -0,0 +1,94 @@
import { ReactNode } from 'react';
import { Popover } from 'antd';
import { themeColors } from 'constants/theme';
import { convertTimeToRelevantUnit } from 'container/TraceDetail/utils';
import { generateColor } from 'lib/uPlotLib/utils/generateColor';
import { Span } from 'types/api/trace/getTraceV2';
import { toFixed } from 'utils/toFixed';
import './SpanHoverCard.styles.scss';
interface ITraceMetadata {
startTime: number;
endTime: number;
}
export interface SpanTooltipContentProps {
spanName: string;
color: string;
hasError: boolean;
relativeStartMs: number;
durationMs: number;
}
export function SpanTooltipContent({
spanName,
color,
hasError,
relativeStartMs,
durationMs,
}: SpanTooltipContentProps): JSX.Element {
const { time: formattedDuration, timeUnitName } = convertTimeToRelevantUnit(
durationMs,
);
return (
<div className="span-hover-card-content">
<div className="span-hover-card-content__name" style={{ color }}>
{spanName}
</div>
<div className="span-hover-card-content__row">
Status: {hasError ? 'error' : 'ok'}
</div>
<div className="span-hover-card-content__row">
Start: {toFixed(relativeStartMs, 2)} ms
</div>
<div className="span-hover-card-content__row">
Duration: {toFixed(formattedDuration, 2)} {timeUnitName}
</div>
</div>
);
}
interface SpanHoverCardProps {
span: Span;
traceMetadata: ITraceMetadata;
children: ReactNode;
}
function SpanHoverCard({
span,
traceMetadata,
children,
}: SpanHoverCardProps): JSX.Element {
const durationMs = span.durationNano / 1e6;
const relativeStartMs = span.timestamp - traceMetadata.startTime;
let color = generateColor(span.serviceName, themeColors.traceDetailColorsV3);
if (span.hasError) {
color = 'var(--bg-cherry-500)';
}
return (
<Popover
mouseEnterDelay={0.2}
content={
<SpanTooltipContent
spanName={span.name}
color={color}
hasError={span.hasError}
relativeStartMs={relativeStartMs}
durationMs={durationMs}
/>
}
trigger="hover"
rootClassName="span-hover-card-popover"
autoAdjustOverflow
arrow={false}
>
{children}
</Popover>
);
}
export default SpanHoverCard;

View File

@@ -0,0 +1,250 @@
import React, { useCallback, useEffect, useRef, useState } from 'react';
import { createPortal } from 'react-dom';
import TimelineV3 from 'components/TimelineV3/TimelineV3';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { EventTooltipContent } from '../SpanHoverCard/EventTooltipContent';
import { SpanTooltipContent } from '../SpanHoverCard/SpanHoverCard';
import { DEFAULT_ROW_HEIGHT } from './constants';
import { useCanvasSetup } from './hooks/useCanvasSetup';
import { useFlamegraphDrag } from './hooks/useFlamegraphDrag';
import { useFlamegraphDraw } from './hooks/useFlamegraphDraw';
import { useFlamegraphHover } from './hooks/useFlamegraphHover';
import { useFlamegraphZoom } from './hooks/useFlamegraphZoom';
import { useScrollToSpan } from './hooks/useScrollToSpan';
import { useVisualLayoutWorker } from './hooks/useVisualLayoutWorker';
import { EventRect, FlamegraphCanvasProps, SpanRect } from './types';
function FlamegraphCanvas(props: FlamegraphCanvasProps): JSX.Element {
const { spans, traceMetadata, firstSpanAtFetchLevel, onSpanClick } = props;
const isDarkMode = useIsDarkMode(); // TODO: see if this can be removed or replaced with a new hook
const canvasRef = useRef<HTMLCanvasElement>(null);
const containerRef = useRef<HTMLDivElement>(null);
const spanRectsRef = useRef<SpanRect[]>([]);
const eventRectsRef = useRef<EventRect[]>([]);
const [viewStartTs, setViewStartTs] = useState<number>(
traceMetadata.startTime,
);
const [viewEndTs, setViewEndTs] = useState<number>(traceMetadata.endTime);
const [scrollTop, setScrollTop] = useState<number>(0);
const [rowHeight, setRowHeight] = useState<number>(DEFAULT_ROW_HEIGHT);
// Mutable refs for zoom and drag hooks to read during rAF / mouse callbacks
const viewStartRef = useRef(viewStartTs);
const viewEndRef = useRef(viewEndTs);
const rowHeightRef = useRef(rowHeight);
const scrollTopRef = useRef(scrollTop);
useEffect(() => {
viewStartRef.current = viewStartTs;
}, [viewStartTs]);
useEffect(() => {
viewEndRef.current = viewEndTs;
}, [viewEndTs]);
useEffect(() => {
rowHeightRef.current = rowHeight;
}, [rowHeight]);
useEffect(() => {
scrollTopRef.current = scrollTop;
}, [scrollTop]);
useEffect(() => {
// TODO: see if this can be removed; once loaded, the view start/end timestamps should not change
setViewStartTs(traceMetadata.startTime);
setViewEndTs(traceMetadata.endTime);
viewStartRef.current = traceMetadata.startTime;
viewEndRef.current = traceMetadata.endTime;
}, [traceMetadata.startTime, traceMetadata.endTime]);
const { layout, isComputing: _isComputing } = useVisualLayoutWorker(spans);
const totalHeight = layout.totalVisualRows * rowHeight;
const { isOverFlamegraphRef } = useFlamegraphZoom({
canvasRef,
traceMetadata,
viewStartRef,
viewEndRef,
rowHeightRef,
setViewStartTs,
setViewEndTs,
setRowHeight,
});
const {
handleMouseDown,
handleMouseMove: handleDragMouseMove,
handleMouseUp,
handleDragMouseLeave,
isDraggingRef,
} = useFlamegraphDrag({
canvasRef,
containerRef,
traceMetadata,
viewStartRef,
viewEndRef,
setViewStartTs,
setViewEndTs,
scrollTopRef,
setScrollTop,
totalHeight,
});
const {
hoveredSpanId,
hoveredEventKey,
handleHoverMouseMove,
handleHoverMouseLeave,
handleMouseDownForClick,
handleClick,
tooltipContent,
} = useFlamegraphHover({
canvasRef,
spanRectsRef,
eventRectsRef,
traceMetadata,
viewStartTs,
viewEndTs,
isDraggingRef,
onSpanClick,
isDarkMode,
});
const { drawFlamegraph } = useFlamegraphDraw({
canvasRef,
containerRef,
spans: layout.visualRows,
connectors: layout.connectors,
viewStartTs,
viewEndTs,
scrollTop,
rowHeight,
selectedSpanId: firstSpanAtFetchLevel || undefined,
hoveredSpanId: hoveredSpanId ?? '',
isDarkMode,
spanRectsRef,
eventRectsRef,
hoveredEventKey,
});
useScrollToSpan({
firstSpanAtFetchLevel,
spans: layout.visualRows,
traceMetadata,
containerRef,
viewStartRef,
viewEndRef,
scrollTopRef,
rowHeight,
setViewStartTs,
setViewEndTs,
setScrollTop,
});
useCanvasSetup(canvasRef, containerRef, drawFlamegraph);
const handleMouseMove = useCallback(
(e: React.MouseEvent): void => {
handleDragMouseMove(e);
handleHoverMouseMove(e);
},
[handleDragMouseMove, handleHoverMouseMove],
);
const handleMouseLeave = useCallback((): void => {
isOverFlamegraphRef.current = false;
handleDragMouseLeave();
handleHoverMouseLeave();
}, [isOverFlamegraphRef, handleDragMouseLeave, handleHoverMouseLeave]);
const tooltipElement = tooltipContent
? createPortal(
<div
className="span-hover-card-popover"
style={{
position: 'fixed',
left: Math.min(tooltipContent.clientX + 15, window.innerWidth - 220),
top: Math.min(tooltipContent.clientY + 15, window.innerHeight - 100),
zIndex: 1000,
backgroundColor: 'rgba(30, 30, 30, 0.95)',
padding: '8px 12px',
borderRadius: 4,
boxShadow: '0 2px 8px rgba(0,0,0,0.3)',
pointerEvents: 'none',
}}
>
{tooltipContent.event ? (
<EventTooltipContent
eventName={tooltipContent.event.name}
timeOffsetMs={tooltipContent.event.timeOffsetMs}
isError={tooltipContent.event.isError}
attributeMap={tooltipContent.event.attributeMap}
/>
) : (
<SpanTooltipContent
spanName={tooltipContent.spanName}
color={tooltipContent.spanColor}
hasError={tooltipContent.status === 'error'}
relativeStartMs={tooltipContent.startMs}
durationMs={tooltipContent.durationMs}
/>
)}
</div>,
document.body,
)
: null;
return (
<div
style={{
display: 'flex',
flexDirection: 'column',
height: '100%',
padding: '0 15px',
}}
>
{tooltipElement}
<TimelineV3
startTimestamp={viewStartTs}
endTimestamp={viewEndTs}
offsetTimestamp={viewStartTs - traceMetadata.startTime}
timelineHeight={10}
/>
<div
ref={containerRef}
style={{
flex: 1,
overflow: 'hidden',
position: 'relative',
}}
onMouseEnter={(): void => {
isOverFlamegraphRef.current = true;
}}
onMouseLeave={handleMouseLeave}
>
<canvas
ref={canvasRef}
style={{
display: 'block',
width: '100%',
cursor: 'grab',
}}
onMouseDown={(e): void => {
handleMouseDown(e);
handleMouseDownForClick(e);
}}
onMouseMove={handleMouseMove}
onMouseUp={handleMouseUp}
onClick={handleClick}
/>
</div>
</div>
);
}
export default FlamegraphCanvas;
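
useFlamegraphZoom itself is not part of this view, but as a rough sketch of the cursor-anchored zoom math such a hook typically applies to the [viewStartTs, viewEndTs] window (hypothetical helper; assumes factor > 1 zooms in):

// Keep the time under the cursor fixed while the window shrinks or grows.
function zoomWindow(
	start: number,
	end: number,
	cursorFraction: number, // cursor x as a fraction of the canvas width
	factor: number,
): [number, number] {
	const pivot = start + (end - start) * cursorFraction;
	return [pivot - (pivot - start) / factor, pivot + (end - pivot) / factor];
}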

View File

@@ -0,0 +1,108 @@
import { useCallback, useEffect, useMemo, useState } from 'react';
import { useHistory, useLocation, useParams } from 'react-router-dom';
import useGetTraceFlamegraph from 'hooks/trace/useGetTraceFlamegraph';
import useUrlQuery from 'hooks/useUrlQuery';
import { TraceDetailFlamegraphURLProps } from 'types/api/trace/getTraceFlamegraph';
import FlamegraphCanvas from './FlamegraphCanvas';
// TODO: analyse whether this enum is needed; if so, move it to a separate file, otherwise delete it.
enum TraceFlamegraphState {
LOADING = 'LOADING',
SUCCESS = 'SUCCESS',
NO_DATA = 'NO_DATA',
ERROR = 'ERROR',
FETCHING_WITH_OLD_DATA = 'FETCHING_WITH_OLD_DATA',
}
function TraceFlamegraph(): JSX.Element {
const { id: traceId } = useParams<TraceDetailFlamegraphURLProps>();
const urlQuery = useUrlQuery();
const history = useHistory();
const { search } = useLocation();
const [firstSpanAtFetchLevel, setFirstSpanAtFetchLevel] = useState<string>(
urlQuery.get('spanId') || '',
);
useEffect(() => {
setFirstSpanAtFetchLevel(urlQuery.get('spanId') || '');
}, [urlQuery]);
const handleSpanClick = useCallback(
(spanId: string): void => {
setFirstSpanAtFetchLevel(spanId);
const searchParams = new URLSearchParams(search);
// TODO: use the query-params constants
if (searchParams.get('spanId') !== spanId) {
searchParams.set('spanId', spanId);
history.replace({ search: searchParams.toString() });
}
},
[history, search],
);
const { data, isFetching, error } = useGetTraceFlamegraph({
traceId,
// selectedSpanId: firstSpanAtFetchLevel,
limit: 120000,
});
const flamegraphState = useMemo(() => {
if (isFetching) {
if (data?.payload?.spans && data.payload.spans.length > 0) {
return TraceFlamegraphState.FETCHING_WITH_OLD_DATA;
}
return TraceFlamegraphState.LOADING;
}
if (error) {
return TraceFlamegraphState.ERROR;
}
if (data?.payload?.spans && data.payload.spans.length === 0) {
return TraceFlamegraphState.NO_DATA;
}
return TraceFlamegraphState.SUCCESS;
}, [error, isFetching, data]);
const spans = useMemo(() => data?.payload?.spans || [], [
data?.payload?.spans,
]);
const content = useMemo(() => {
switch (flamegraphState) {
case TraceFlamegraphState.LOADING:
return <div>Loading...</div>;
case TraceFlamegraphState.ERROR:
return <div>Error loading flamegraph</div>;
case TraceFlamegraphState.NO_DATA:
return <div>No data found for trace {traceId}</div>;
case TraceFlamegraphState.SUCCESS:
case TraceFlamegraphState.FETCHING_WITH_OLD_DATA:
return (
<FlamegraphCanvas
spans={spans}
firstSpanAtFetchLevel={firstSpanAtFetchLevel}
setFirstSpanAtFetchLevel={setFirstSpanAtFetchLevel}
onSpanClick={handleSpanClick}
traceMetadata={{
startTime: data?.payload?.startTimestampMillis || 0,
endTime: data?.payload?.endTimestampMillis || 0,
}}
/>
);
default:
return <div>Fetching the trace...</div>;
}
}, [
data?.payload?.endTimestampMillis,
data?.payload?.startTimestampMillis,
firstSpanAtFetchLevel,
flamegraphState,
spans,
traceId,
handleSpanClick,
]);
return <>{content}</>;
}
export default TraceFlamegraph;

View File

@@ -0,0 +1,475 @@
import { FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
import { computeVisualLayout } from '../computeVisualLayout';
function makeSpan(
overrides: Partial<FlamegraphSpan> & {
spanId: string;
timestamp: number;
durationNano: number;
},
): FlamegraphSpan {
return {
parentSpanId: '',
traceId: 'trace-1',
hasError: false,
serviceName: 'svc',
name: 'op',
level: 0,
event: [],
...overrides,
};
}
describe('computeVisualLayout', () => {
it('should handle empty input', () => {
const layout = computeVisualLayout([]);
expect(layout.totalVisualRows).toBe(0);
expect(layout.visualRows).toEqual([]);
});
it('should handle single root, no children — 1 visual row', () => {
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 100e6,
});
const layout = computeVisualLayout([[root]]);
expect(layout.totalVisualRows).toBe(1);
expect(layout.visualRows[0]).toEqual([root]);
expect(layout.spanToVisualRow['root']).toBe(0);
});
it('should keep non-overlapping siblings on the same row (compact)', () => {
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 500e6,
});
const a = makeSpan({
spanId: 'a',
parentSpanId: 'root',
timestamp: 0,
durationNano: 100e6,
});
const b = makeSpan({
spanId: 'b',
parentSpanId: 'root',
timestamp: 200,
durationNano: 100e6,
});
const c = makeSpan({
spanId: 'c',
parentSpanId: 'root',
timestamp: 400,
durationNano: 100e6,
});
const layout = computeVisualLayout([[root], [a, b, c]]);
// root on row 0, all children on row 1
expect(layout.totalVisualRows).toBe(2);
expect(layout.spanToVisualRow['root']).toBe(0);
expect(layout.spanToVisualRow['a']).toBe(1);
expect(layout.spanToVisualRow['b']).toBe(1);
expect(layout.spanToVisualRow['c']).toBe(1);
});
it('should pack non-overlapping siblings into shared lanes (greedy packing)', () => {
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 300e6,
});
// A and B overlap; C does not overlap with either
const a = makeSpan({
spanId: 'a',
parentSpanId: 'root',
timestamp: 0,
durationNano: 100e6, // ends at 100ms
});
const b = makeSpan({
spanId: 'b',
parentSpanId: 'root',
timestamp: 50,
durationNano: 100e6, // starts at 50ms < 100ms end of A → overlap → lane 1
});
const c = makeSpan({
spanId: 'c',
parentSpanId: 'root',
timestamp: 200,
durationNano: 100e6, // 200 >= 100, fits lane 0 with A
});
const layout = computeVisualLayout([[root], [a, b, c]]);
// root on row 0, C placed first (latest) → row 1, B doesn't overlap C → row 1, A overlaps B → row 2
expect(layout.totalVisualRows).toBe(3);
expect(layout.spanToVisualRow['root']).toBe(0);
expect(layout.spanToVisualRow['c']).toBe(1);
expect(layout.spanToVisualRow['b']).toBe(1);
expect(layout.spanToVisualRow['a']).toBe(2);
});
it('should handle full overlap — all siblings get own row', () => {
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 200e6,
});
const a = makeSpan({
spanId: 'a',
parentSpanId: 'root',
timestamp: 0,
durationNano: 200e6,
});
const b = makeSpan({
spanId: 'b',
parentSpanId: 'root',
timestamp: 0,
durationNano: 200e6,
});
const layout = computeVisualLayout([[root], [a, b]]);
expect(layout.totalVisualRows).toBe(3);
expect(layout.spanToVisualRow['a']).toBe(1);
expect(layout.spanToVisualRow['b']).toBe(2);
});
it('should stack children correctly below overlapping parents', () => {
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 300e6,
});
const a = makeSpan({
spanId: 'a',
parentSpanId: 'root',
timestamp: 0,
durationNano: 200e6,
});
const b = makeSpan({
spanId: 'b',
parentSpanId: 'root',
timestamp: 50,
durationNano: 200e6,
});
// Child of A
const childA = makeSpan({
spanId: 'childA',
parentSpanId: 'a',
timestamp: 10,
durationNano: 50e6,
});
// Child of B
const childB = makeSpan({
spanId: 'childB',
parentSpanId: 'b',
timestamp: 60,
durationNano: 50e6,
});
const layout = computeVisualLayout([[root], [a, b], [childA, childB]]);
// DFS processes b's subtree first (latest):
// root → row 0
// b → row 1 (parentRow 0 + 1)
// childB → row 2 (parentRow 1 + 1)
// a → try row 1 (parentRow 0 + 1), overlaps b → try row 2, overlaps childB → row 3
// childA → row 4 (parentRow 3 + 1)
expect(layout.spanToVisualRow['root']).toBe(0);
expect(layout.spanToVisualRow['b']).toBe(1);
expect(layout.spanToVisualRow['childB']).toBe(2);
expect(layout.spanToVisualRow['a']).toBe(3);
expect(layout.spanToVisualRow['childA']).toBe(4);
expect(layout.totalVisualRows).toBe(5);
});
it('should handle multiple roots as a sibling group', () => {
// Two overlapping roots
const r1 = makeSpan({
spanId: 'r1',
timestamp: 0,
durationNano: 100e6,
});
const r2 = makeSpan({
spanId: 'r2',
timestamp: 50,
durationNano: 100e6,
});
const layout = computeVisualLayout([[r1, r2]]);
expect(layout.spanToVisualRow['r1']).toBe(0);
expect(layout.spanToVisualRow['r2']).toBe(1);
expect(layout.totalVisualRows).toBe(2);
});
it('should produce compact layout for deep nesting without overlap', () => {
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 1000e6,
});
const child = makeSpan({
spanId: 'child',
parentSpanId: 'root',
timestamp: 10,
durationNano: 500e6,
});
const grandchild = makeSpan({
spanId: 'grandchild',
parentSpanId: 'child',
timestamp: 20,
durationNano: 200e6,
});
const layout = computeVisualLayout([[root], [child], [grandchild]]);
// No overlap at any level → visual rows == tree depth
expect(layout.totalVisualRows).toBe(3);
expect(layout.spanToVisualRow['root']).toBe(0);
expect(layout.spanToVisualRow['child']).toBe(1);
expect(layout.spanToVisualRow['grandchild']).toBe(2);
});
it('should pack many sequential siblings into 1 row (no diagonal staircase)', () => {
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 500e6,
});
// 6 sequential children — like checkoutservice/PlaceOrder scenario
const spans = [
makeSpan({
spanId: 's1',
parentSpanId: 'root',
timestamp: 3,
durationNano: 30e6,
}),
makeSpan({
spanId: 's2',
parentSpanId: 'root',
timestamp: 35,
durationNano: 4e6,
}),
makeSpan({
spanId: 's3',
parentSpanId: 'root',
timestamp: 39,
durationNano: 1e6,
}),
makeSpan({
spanId: 's4',
parentSpanId: 'root',
timestamp: 40,
durationNano: 4e6,
}),
makeSpan({
spanId: 's5',
parentSpanId: 'root',
timestamp: 44,
durationNano: 5e6,
}),
makeSpan({
spanId: 's6',
parentSpanId: 'root',
timestamp: 49,
durationNano: 1e6,
}),
];
const layout = computeVisualLayout([[root], spans]);
// All 6 sequential siblings should share 1 row
expect(layout.totalVisualRows).toBe(2);
expect(layout.spanToVisualRow['root']).toBe(0);
for (const span of spans) {
expect(layout.spanToVisualRow[span.spanId]).toBe(1);
}
});
it('should keep children below parents even with misparented spans', () => {
// Simulates the dd_sig2 bug: /route spans have parentSpanId pointing
// to the wrong ancestor, but they are at level 2 in the spans[][] input.
// Level-based packing must place them below level 1 regardless.
const httpGet = makeSpan({
spanId: 'http-get',
timestamp: 0,
durationNano: 500e6,
});
const route = makeSpan({
spanId: 'route',
parentSpanId: 'some-wrong-ancestor', // misparented!
timestamp: 10,
durationNano: 200e6,
});
const layout = computeVisualLayout([[httpGet], [route]]);
// httpGet at level 0 → row 0, route at level 1 → row 1
expect(layout.spanToVisualRow['http-get']).toBe(0);
expect(layout.spanToVisualRow['route']).toBe(1);
expect(layout.totalVisualRows).toBe(2);
});
it('should keep parent-child pairs adjacent when sibling subtrees overlap', () => {
// Multiple overlapping parents each with a child — the subtree-unit
// guarantee means every parent→child gap should be exactly 1.
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 500e6,
});
// Three overlapping HTTP GET children of root, each with its own /route child
const get1 = makeSpan({
spanId: 'get1',
parentSpanId: 'root',
timestamp: 0,
durationNano: 200e6,
});
const route1 = makeSpan({
spanId: 'route1',
parentSpanId: 'get1',
timestamp: 10,
durationNano: 180e6,
});
const get2 = makeSpan({
spanId: 'get2',
parentSpanId: 'root',
timestamp: 50,
durationNano: 200e6,
});
const route2 = makeSpan({
spanId: 'route2',
parentSpanId: 'get2',
timestamp: 60,
durationNano: 180e6,
});
const get3 = makeSpan({
spanId: 'get3',
parentSpanId: 'root',
timestamp: 100,
durationNano: 200e6,
});
const route3 = makeSpan({
spanId: 'route3',
parentSpanId: 'get3',
timestamp: 110,
durationNano: 180e6,
});
const layout = computeVisualLayout([
[root],
[get1, get2, get3],
[route1, route2, route3],
]);
// Each parent-child pair should have a gap of exactly 1
const get1Row = layout.spanToVisualRow['get1'];
const route1Row = layout.spanToVisualRow['route1'];
const get2Row = layout.spanToVisualRow['get2'];
const route2Row = layout.spanToVisualRow['route2'];
const get3Row = layout.spanToVisualRow['get3'];
const route3Row = layout.spanToVisualRow['route3'];
expect(route1Row - get1Row).toBe(1);
expect(route2Row - get2Row).toBe(1);
expect(route3Row - get3Row).toBe(1);
});
it('should handle mixed levels — overlap at level 2 but not level 1', () => {
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 1000e6,
});
// Non-overlapping children
const a = makeSpan({
spanId: 'a',
parentSpanId: 'root',
timestamp: 0,
durationNano: 400e6,
});
const b = makeSpan({
spanId: 'b',
parentSpanId: 'root',
timestamp: 500,
durationNano: 400e6,
});
// Overlapping grandchildren under A
const ga1 = makeSpan({
spanId: 'ga1',
parentSpanId: 'a',
timestamp: 0,
durationNano: 200e6,
});
const ga2 = makeSpan({
spanId: 'ga2',
parentSpanId: 'a',
timestamp: 100,
durationNano: 200e6,
});
const layout = computeVisualLayout([[root], [a, b], [ga1, ga2]]);
// root → row 0
// a, b → row 1 (no overlap, share row)
// ga1 → row 2, ga2 → row 3 (overlap, expanded)
// b has no children, so nothing after ga2
expect(layout.spanToVisualRow['root']).toBe(0);
expect(layout.spanToVisualRow['a']).toBe(1);
expect(layout.spanToVisualRow['b']).toBe(1);
expect(layout.spanToVisualRow['ga2']).toBe(2);
expect(layout.spanToVisualRow['ga1']).toBe(3);
expect(layout.totalVisualRows).toBe(4);
});
it('should not place a span where it covers an existing connector point (Check 2)', () => {
// Scenario: root has 3 leaf children. Sorted latest-first: C(200), B(100), A(80).
//
// C placed at row 1 [200, 400].
// B overlaps C → placed at row 2 [100, 300]. Connector from row 0→2 at x=100
// passes through row 1, recording connector point at (row 1, x=100).
// A [80, 110] does NOT overlap C's span [200, 400] at row 1 (110 < 200),
// so without connector reservation A would fit at row 1.
// But A's span [80, 110) contains the connector point x=100 at row 1.
// Check 2 prevents this placement, pushing A further down.
const root = makeSpan({
spanId: 'root',
timestamp: 0,
durationNano: 500e6,
});
const c = makeSpan({
spanId: 'c',
parentSpanId: 'root',
timestamp: 200,
durationNano: 200e6, // [200, 400]
});
const b = makeSpan({
spanId: 'b',
parentSpanId: 'root',
timestamp: 100,
durationNano: 200e6, // [100, 300]
});
const a = makeSpan({
spanId: 'a',
parentSpanId: 'root',
timestamp: 80,
durationNano: 30e6, // [80, 110]
});
const layout = computeVisualLayout([[root], [a, b, c]]);
expect(layout.spanToVisualRow['root']).toBe(0);
expect(layout.spanToVisualRow['c']).toBe(1); // latest, placed first
expect(layout.spanToVisualRow['b']).toBe(2); // overlaps C → row 2
// A would fit at row 1 by span overlap alone, but connector point at
// (row 1, x=100) falls within A's span [80, 110). Check 2 pushes A down.
const aRow = layout.spanToVisualRow['a']!;
expect(aRow).toBeGreaterThan(1); // must NOT be at row 1
expect(aRow).toBe(3); // next free row after B at row 2 (A overlaps B)
});
});
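
computeVisualLayout itself is not shown in this view; a minimal sketch of the greedy lane packing these tests assert (latest-first ordering, first non-overlapping row below the parent), flattened to a single sibling group for brevity and omitting the connector-point reservation of Check 2:

interface Bar {
	id: string;
	start: number;
	end: number;
	parentRow: number;
}

function packRows(bars: Bar[]): Record<string, number> {
	const rows: Array<Array<[number, number]>> = [];
	const placement: Record<string, number> = {};
	// Latest-first, mirroring the "C placed first" ordering asserted above.
	const sorted = [...bars].sort((a, b) => b.start - a.start);
	sorted.forEach((bar) => {
		let row = bar.parentRow + 1;
		// Take the first row below the parent with no overlapping occupant.
		while ((rows[row] ?? []).some(([s, e]) => bar.start < e && s < bar.end)) {
			row += 1;
		}
		rows[row] = [...(rows[row] ?? []), [bar.start, bar.end]];
		placement[bar.id] = row;
	});
	return placement;
}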

View File

@@ -0,0 +1,539 @@
import { DASHED_BORDER_LINE_DASH, MIN_WIDTH_FOR_NAME } from '../constants';
import type { FlamegraphRowMetrics } from '../utils';
import { getFlamegraphRowMetrics } from '../utils';
import { drawEventDot, drawSpanBar, getEventDotColor } from '../utils';
import { MOCK_SPAN } from './testUtils';
jest.mock('container/TraceDetail/utils', () => ({
convertTimeToRelevantUnit: (): { time: number; timeUnitName: string } => ({
time: 50,
timeUnitName: 'ms',
}),
}));
/** Minimal 2D context for createStripePattern's internal canvas (jsdom getContext often returns null) */
const mockPatternCanvasCtx = {
beginPath: jest.fn(),
moveTo: jest.fn(),
lineTo: jest.fn(),
stroke: jest.fn(),
globalAlpha: 1,
};
const originalCreateElement = document.createElement.bind(document);
document.createElement = function (
tagName: string,
): ReturnType<typeof originalCreateElement> {
const el = originalCreateElement(tagName);
if (tagName.toLowerCase() === 'canvas') {
(el as HTMLCanvasElement).getContext = (() =>
mockPatternCanvasCtx as unknown) as HTMLCanvasElement['getContext'];
}
return el;
};
function createMockCtx(): jest.Mocked<CanvasRenderingContext2D> {
return ({
beginPath: jest.fn(),
roundRect: jest.fn(),
fill: jest.fn(),
stroke: jest.fn(),
save: jest.fn(),
restore: jest.fn(),
translate: jest.fn(),
rotate: jest.fn(),
fillRect: jest.fn(),
strokeRect: jest.fn(),
setLineDash: jest.fn(),
measureText: jest.fn(
(text: string) => ({ width: text.length * 6 } as TextMetrics),
),
createPattern: jest.fn(() => ({} as CanvasPattern)),
clip: jest.fn(),
rect: jest.fn(),
fillText: jest.fn(),
font: '',
fillStyle: '',
strokeStyle: '',
textAlign: '',
textBaseline: '',
lineWidth: 0,
globalAlpha: 1,
} as unknown) as jest.Mocked<CanvasRenderingContext2D>;
}
const METRICS: FlamegraphRowMetrics = getFlamegraphRowMetrics(24);
describe('Canvas Draw Utils', () => {
describe('drawSpanBar', () => {
it('draws rect + fill for normal span (no selected/hovered)', () => {
const ctx = createMockCtx();
const spanRectsArray: {
span: typeof MOCK_SPAN;
x: number;
y: number;
width: number;
height: number;
level: number;
}[] = [];
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, event: [] },
x: 10,
y: 0,
width: 100,
levelIndex: 0,
spanRectsArray,
eventRectsArray: [],
color: '#1890ff',
isDarkMode: false,
metrics: METRICS,
});
expect(ctx.beginPath).toHaveBeenCalled();
expect(ctx.roundRect).toHaveBeenCalledWith(10, 1, 100, 22, 2);
expect(ctx.fill).toHaveBeenCalled();
expect(ctx.stroke).not.toHaveBeenCalled();
expect(spanRectsArray).toHaveLength(1);
expect(spanRectsArray[0]).toMatchObject({
x: 10,
y: 1,
width: 100,
height: 22,
level: 0,
});
});
it('uses stripe pattern + dashed stroke + 2px when selected', () => {
const ctx = createMockCtx();
const spanRectsArray: {
span: typeof MOCK_SPAN;
x: number;
y: number;
width: number;
height: number;
level: number;
}[] = [];
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, spanId: 'sel', event: [] },
x: 20,
y: 0,
width: 80,
levelIndex: 1,
spanRectsArray,
eventRectsArray: [],
color: '#2F80ED',
isDarkMode: false,
metrics: METRICS,
selectedSpanId: 'sel',
});
expect(ctx.createPattern).toHaveBeenCalled();
expect(ctx.setLineDash).toHaveBeenCalledWith(DASHED_BORDER_LINE_DASH);
expect(ctx.strokeStyle).toBe('#2F80ED');
expect(ctx.lineWidth).toBe(2);
expect(ctx.stroke).toHaveBeenCalled();
expect(ctx.setLineDash).toHaveBeenLastCalledWith([]);
});
it('uses stripe pattern + solid stroke + 1px when hovered (not selected)', () => {
const ctx = createMockCtx();
const spanRectsArray: {
span: typeof MOCK_SPAN;
x: number;
y: number;
width: number;
height: number;
level: number;
}[] = [];
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, spanId: 'hov', event: [] },
x: 30,
y: 0,
width: 60,
levelIndex: 0,
spanRectsArray,
eventRectsArray: [],
color: '#2F80ED',
isDarkMode: false,
metrics: METRICS,
hoveredSpanId: 'hov',
});
expect(ctx.createPattern).toHaveBeenCalled();
expect(ctx.setLineDash).not.toHaveBeenCalled();
expect(ctx.lineWidth).toBe(1);
expect(ctx.stroke).toHaveBeenCalled();
});
it('pushes spanRectsArray with correct dimensions', () => {
const ctx = createMockCtx();
const spanRectsArray: {
span: typeof MOCK_SPAN;
x: number;
y: number;
width: number;
height: number;
level: number;
}[] = [];
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, spanId: 'rect-test', event: [] },
x: 5,
y: 24,
width: 200,
levelIndex: 2,
spanRectsArray,
eventRectsArray: [],
color: '#000',
isDarkMode: false,
metrics: METRICS,
});
expect(spanRectsArray[0]).toMatchObject({
x: 5,
y: 25,
width: 200,
height: 22,
level: 2,
});
expect(spanRectsArray[0].span.spanId).toBe('rect-test');
});
});
describe('drawSpanLabel (via drawSpanBar)', () => {
it('skips label when width < MIN_WIDTH_FOR_NAME', () => {
const ctx = createMockCtx();
const spanRectsArray: {
span: typeof MOCK_SPAN;
x: number;
y: number;
width: number;
height: number;
level: number;
}[] = [];
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, name: 'long-span-name', event: [] },
x: 0,
y: 0,
width: MIN_WIDTH_FOR_NAME - 1,
levelIndex: 0,
spanRectsArray,
eventRectsArray: [],
color: '#000',
isDarkMode: false,
metrics: METRICS,
});
expect(ctx.clip).not.toHaveBeenCalled();
expect(ctx.fillText).not.toHaveBeenCalled();
});
it('draws name only when width >= MIN_WIDTH_FOR_NAME but < MIN_WIDTH_FOR_NAME_AND_DURATION', () => {
const ctx = createMockCtx();
ctx.measureText = jest.fn(
(t: string) => ({ width: t.length * 6 } as TextMetrics),
);
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, name: 'foo', event: [] },
x: 0,
y: 0,
width: 50,
levelIndex: 0,
spanRectsArray: [],
eventRectsArray: [],
color: '#000',
isDarkMode: false,
metrics: METRICS,
});
expect(ctx.clip).toHaveBeenCalled();
expect(ctx.fillText).toHaveBeenCalled();
expect(ctx.textAlign).toBe('left');
});
it('draws name + duration when width >= MIN_WIDTH_FOR_NAME_AND_DURATION', () => {
const ctx = createMockCtx();
ctx.measureText = jest.fn(
(t: string) => ({ width: t.length * 6 } as TextMetrics),
);
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, name: 'my-span', event: [] },
x: 0,
y: 0,
width: 100,
levelIndex: 0,
spanRectsArray: [],
eventRectsArray: [],
color: '#000',
isDarkMode: false,
metrics: METRICS,
});
expect(ctx.fillText).toHaveBeenCalledTimes(2);
expect(ctx.fillText).toHaveBeenCalledWith(
'50ms',
expect.any(Number),
expect.any(Number),
);
expect(ctx.fillText).toHaveBeenCalledWith(
'my-span',
expect.any(Number),
expect.any(Number),
);
});
});
describe('truncateText (via drawSpanBar)', () => {
it('uses full text when it fits', () => {
const ctx = createMockCtx();
ctx.measureText = jest.fn(
(t: string) => ({ width: t.length * 4 } as TextMetrics),
);
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, name: 'short', event: [] },
x: 0,
y: 0,
width: 100,
levelIndex: 0,
spanRectsArray: [],
eventRectsArray: [],
color: '#000',
isDarkMode: false,
metrics: METRICS,
});
expect(ctx.fillText).toHaveBeenCalledWith(
'short',
expect.any(Number),
expect.any(Number),
);
});
it('truncates text when it exceeds available width', () => {
const ctx = createMockCtx();
ctx.measureText = jest.fn(
(t: string) =>
({
width: t.includes('...') ? 24 : t.length * 10,
} as TextMetrics),
);
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, name: 'very-long-span-name', event: [] },
x: 0,
y: 0,
width: 50,
levelIndex: 0,
spanRectsArray: [],
eventRectsArray: [],
color: '#000',
isDarkMode: false,
metrics: METRICS,
});
const fillTextCalls = (ctx.fillText as jest.Mock).mock.calls;
const nameArg = fillTextCalls.find((c) => c[0] !== '50ms')?.[0];
expect(nameArg).toBeDefined();
expect(nameArg).toMatch(/\.\.\.$/);
});
});
describe('drawEventDot', () => {
it('uses error styling when isError is true', () => {
const ctx = createMockCtx();
const color = getEventDotColor('#000', true, false);
drawEventDot({
ctx,
x: 50,
y: 11,
color,
eventDotSize: 6,
});
expect(ctx.save).toHaveBeenCalled();
expect(ctx.translate).toHaveBeenCalledWith(50, 11);
expect(ctx.rotate).toHaveBeenCalledWith(Math.PI / 4);
expect(ctx.fillStyle).toBe('rgb(220, 38, 38)');
expect(ctx.strokeStyle).toBe('rgb(153, 27, 27)');
expect(ctx.fillRect).toHaveBeenCalledWith(-3, -3, 6, 6);
expect(ctx.strokeRect).toHaveBeenCalledWith(-3, -3, 6, 6);
expect(ctx.restore).toHaveBeenCalled();
});
it('derives color from span color when isError is false', () => {
const ctx = createMockCtx();
const color = getEventDotColor('rgb(100, 200, 150)', false, false);
drawEventDot({
ctx,
x: 0,
y: 0,
color,
eventDotSize: 6,
});
// Darkened by 20% for fill
expect(ctx.fillStyle).toBe('rgb(80, 160, 120)');
// Darkened by 40% for stroke
expect(ctx.strokeStyle).toBe('rgb(60, 120, 90)');
});
it('uses dark mode colors for error', () => {
const ctx = createMockCtx();
const color = getEventDotColor('#000', true, true);
drawEventDot({
ctx,
x: 0,
y: 0,
color,
eventDotSize: 6,
});
expect(ctx.fillStyle).toBe('rgb(239, 68, 68)');
expect(ctx.strokeStyle).toBe('rgb(185, 28, 28)');
});
it('falls back to cyan/blue for unparseable span colors', () => {
const ctx = createMockCtx();
const color = getEventDotColor('hsl(200, 50%, 50%)', false, false);
drawEventDot({
ctx,
x: 0,
y: 0,
color,
eventDotSize: 6,
});
expect(ctx.fillStyle).toBe('rgb(6, 182, 212)');
expect(ctx.strokeStyle).toBe('rgb(8, 145, 178)');
});
it('calls save, translate, rotate, restore', () => {
const ctx = createMockCtx();
const color = getEventDotColor('#000', false, false);
drawEventDot({
ctx,
x: 10,
y: 20,
color,
eventDotSize: 4,
});
expect(ctx.save).toHaveBeenCalled();
expect(ctx.translate).toHaveBeenCalledWith(10, 20);
expect(ctx.rotate).toHaveBeenCalledWith(Math.PI / 4);
expect(ctx.restore).toHaveBeenCalled();
});
});
describe('createStripePattern (via drawSpanBar)', () => {
it('uses pattern when createPattern returns non-null', () => {
const ctx = createMockCtx();
const mockPattern = {} as CanvasPattern;
(ctx.createPattern as jest.Mock).mockReturnValue(mockPattern);
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, spanId: 'p', event: [] },
x: 0,
y: 0,
width: MIN_WIDTH_FOR_NAME - 1,
levelIndex: 0,
spanRectsArray: [],
eventRectsArray: [],
color: '#000',
isDarkMode: false,
metrics: METRICS,
hoveredSpanId: 'p',
});
expect(ctx.createPattern).toHaveBeenCalled();
expect(ctx.fillStyle).toBe(mockPattern);
expect(ctx.fill).toHaveBeenCalled();
});
it('skips fill when createPattern returns null', () => {
const ctx = createMockCtx();
(ctx.createPattern as jest.Mock).mockReturnValue(null);
drawSpanBar({
ctx,
span: { ...MOCK_SPAN, spanId: 'p', event: [] },
x: 0,
y: 0,
width: MIN_WIDTH_FOR_NAME - 1,
levelIndex: 0,
spanRectsArray: [],
eventRectsArray: [],
color: '#000',
isDarkMode: false,
metrics: METRICS,
selectedSpanId: 'p',
});
expect(ctx.fill).not.toHaveBeenCalled();
expect(ctx.stroke).toHaveBeenCalled();
});
});
describe('drawSpanBar with events', () => {
it('draws event dots for each span event', () => {
const ctx = createMockCtx();
const spanWithEvents = {
...MOCK_SPAN,
event: [
{
name: 'e1',
timeUnixNano: 1_010_000_000,
attributeMap: {},
isError: false,
},
{
name: 'e2',
timeUnixNano: 1_025_000_000,
attributeMap: {},
isError: true,
},
],
};
drawSpanBar({
ctx,
span: spanWithEvents,
x: 0,
y: 0,
width: 100,
levelIndex: 0,
spanRectsArray: [],
eventRectsArray: [],
color: '#000',
isDarkMode: false,
metrics: METRICS,
});
expect(ctx.save).toHaveBeenCalledTimes(3);
expect(ctx.translate).toHaveBeenCalledTimes(2);
expect(ctx.fillRect).toHaveBeenCalledTimes(2);
});
});
});
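
For reference, the channel math the drawEventDot color assertions imply (a hypothetical helper, not the actual getEventDotColor internals): each RGB channel is scaled by 0.8 for the fill ("darkened by 20%") and 0.6 for the stroke ("darkened by 40%").

const darken = (r: number, g: number, b: number, f: number): string =>
	`rgb(${Math.round(r * f)}, ${Math.round(g * f)}, ${Math.round(b * f)})`;
// darken(100, 200, 150, 0.8) → 'rgb(80, 160, 120)' (fill)
// darken(100, 200, 150, 0.6) → 'rgb(60, 120, 90)'  (stroke)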

View File

@@ -0,0 +1,54 @@
import { FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
/** Minimal FlamegraphSpan for unit tests */
export const MOCK_SPAN: FlamegraphSpan = {
timestamp: 1000,
durationNano: 50_000_000, // 50ms
spanId: 'span-1',
parentSpanId: '',
traceId: 'trace-1',
hasError: false,
serviceName: 'test-service',
name: 'test-span',
level: 0,
event: [],
};
/** Nested spans structure for findSpanById tests */
export const MOCK_SPANS: FlamegraphSpan[][] = [
[
{
...MOCK_SPAN,
spanId: 'root',
parentSpanId: '',
level: 0,
},
],
[
{
...MOCK_SPAN,
spanId: 'child-a',
parentSpanId: 'root',
level: 1,
},
{
...MOCK_SPAN,
spanId: 'child-b',
parentSpanId: 'root',
level: 1,
},
],
[
{
...MOCK_SPAN,
spanId: 'grandchild',
parentSpanId: 'child-a',
level: 2,
},
],
];
export const MOCK_TRACE_METADATA = {
startTime: 0,
endTime: 1000,
};

View File

@@ -0,0 +1,144 @@
import React from 'react';
import { act, renderHook } from '@testing-library/react';
import { useFlamegraphDrag } from '../hooks/useFlamegraphDrag';
import { MOCK_TRACE_METADATA } from './testUtils';
function createMockCanvas(): HTMLCanvasElement {
const canvas = document.createElement('canvas');
canvas.getBoundingClientRect = jest.fn(
(): DOMRect =>
({
left: 0,
top: 0,
width: 800,
height: 400,
x: 0,
y: 0,
bottom: 400,
right: 800,
toJSON: (): Record<string, unknown> => ({}),
} as DOMRect),
);
return canvas;
}
function createMockContainer(): HTMLDivElement {
const div = document.createElement('div');
Object.defineProperty(div, 'clientHeight', { value: 400 });
return div;
}
const defaultArgs = {
canvasRef: { current: createMockCanvas() },
containerRef: { current: createMockContainer() },
traceMetadata: MOCK_TRACE_METADATA,
viewStartRef: { current: 0 },
viewEndRef: { current: 1000 },
setViewStartTs: jest.fn(),
setViewEndTs: jest.fn(),
scrollTopRef: { current: 0 },
setScrollTop: jest.fn(),
totalHeight: 1000,
};
describe('useFlamegraphDrag', () => {
beforeEach(() => {
jest.clearAllMocks();
defaultArgs.viewStartRef.current = 0;
defaultArgs.viewEndRef.current = 1000;
defaultArgs.scrollTopRef.current = 0;
});
it('starts drag state on mousedown', () => {
const { result } = renderHook(() => useFlamegraphDrag(defaultArgs));
act(() => {
result.current.handleMouseDown(({
button: 0,
clientX: 100,
clientY: 50,
preventDefault: jest.fn(),
} as unknown) as React.MouseEvent);
});
expect(result.current.isDraggingRef.current).toBe(true);
});
it('ignores non-left button mousedown', () => {
const { result } = renderHook(() => useFlamegraphDrag(defaultArgs));
act(() => {
result.current.handleMouseDown(({
button: 1,
clientX: 100,
clientY: 50,
preventDefault: jest.fn(),
} as unknown) as React.MouseEvent);
});
expect(result.current.isDraggingRef.current).toBe(false);
});
it('updates pan/scroll on mousemove', () => {
const { result } = renderHook(() => useFlamegraphDrag(defaultArgs));
act(() => {
result.current.handleMouseDown(({
button: 0,
clientX: 100,
clientY: 50,
preventDefault: jest.fn(),
} as unknown) as React.MouseEvent);
});
act(() => {
result.current.handleMouseMove(({
clientX: 150,
clientY: 100,
} as unknown) as React.MouseEvent);
});
expect(defaultArgs.setViewStartTs).toHaveBeenCalled();
expect(defaultArgs.setViewEndTs).toHaveBeenCalled();
expect(defaultArgs.setScrollTop).toHaveBeenCalled();
});
it('resets drag state on mouseup', () => {
const { result } = renderHook(() => useFlamegraphDrag(defaultArgs));
act(() => {
result.current.handleMouseDown(({
button: 0,
clientX: 100,
clientY: 50,
preventDefault: jest.fn(),
} as unknown) as React.MouseEvent);
});
act(() => {
result.current.handleMouseUp();
});
expect(result.current.isDraggingRef.current).toBe(false);
});
it('cancels drag on mouseleave', () => {
const { result } = renderHook(() => useFlamegraphDrag(defaultArgs));
act(() => {
result.current.handleMouseDown(({
button: 0,
clientX: 100,
clientY: 50,
preventDefault: jest.fn(),
} as unknown) as React.MouseEvent);
});
act(() => {
result.current.handleDragMouseLeave();
});
expect(result.current.isDraggingRef.current).toBe(false);
});
});

View File

@@ -0,0 +1,179 @@
import type React from 'react';
import { act, renderHook } from '@testing-library/react';
import { useFlamegraphHover } from '../hooks/useFlamegraphHover';
import type { EventRect, SpanRect } from '../types';
import { MOCK_SPAN, MOCK_TRACE_METADATA } from './testUtils';
function createMockCanvas(): HTMLCanvasElement {
const canvas = document.createElement('canvas');
canvas.width = 800;
canvas.height = 400;
canvas.getBoundingClientRect = jest.fn(
(): DOMRect =>
({
left: 0,
top: 0,
width: 800,
height: 400,
x: 0,
y: 0,
bottom: 400,
right: 800,
toJSON: (): Record<string, unknown> => ({}),
} as DOMRect),
);
return canvas;
}
const spanRect: SpanRect = {
span: { ...MOCK_SPAN, spanId: 'hover-span', name: 'test-span' },
x: 100,
y: 50,
width: 200,
height: 22,
level: 0,
};
const defaultArgs = {
canvasRef: { current: createMockCanvas() },
spanRectsRef: { current: [spanRect] },
eventRectsRef: { current: [] as EventRect[] },
traceMetadata: MOCK_TRACE_METADATA,
viewStartTs: MOCK_TRACE_METADATA.startTime,
viewEndTs: MOCK_TRACE_METADATA.endTime,
isDraggingRef: { current: false },
onSpanClick: jest.fn(),
isDarkMode: false,
};
describe('useFlamegraphHover', () => {
beforeEach(() => {
Object.defineProperty(window, 'devicePixelRatio', {
configurable: true,
value: 1,
});
jest.clearAllMocks();
defaultArgs.spanRectsRef.current = [spanRect];
defaultArgs.isDraggingRef.current = false;
});
it('sets hoveredSpanId and tooltipContent when hovering on span', () => {
const { result } = renderHook(() => useFlamegraphHover(defaultArgs));
act(() => {
result.current.handleHoverMouseMove({
clientX: 150,
clientY: 61,
} as React.MouseEvent);
});
expect(result.current.hoveredSpanId).toBe('hover-span');
expect(result.current.tooltipContent).not.toBeNull();
expect(result.current.tooltipContent?.spanName).toBe('test-span');
expect(result.current.tooltipContent?.clientX).toBe(150);
expect(result.current.tooltipContent?.clientY).toBe(61);
});
it('clears hover when moving to empty area', () => {
const { result } = renderHook(() => useFlamegraphHover(defaultArgs));
act(() => {
result.current.handleHoverMouseMove({
clientX: 150,
clientY: 61,
} as React.MouseEvent);
});
expect(result.current.hoveredSpanId).toBe('hover-span');
act(() => {
result.current.handleHoverMouseMove({
clientX: 10,
clientY: 10,
} as React.MouseEvent);
});
expect(result.current.hoveredSpanId).toBeNull();
expect(result.current.tooltipContent).toBeNull();
});
it('clears hover on mouse leave', () => {
const { result } = renderHook(() => useFlamegraphHover(defaultArgs));
act(() => {
result.current.handleHoverMouseMove({
clientX: 150,
clientY: 61,
} as React.MouseEvent);
});
act(() => {
result.current.handleHoverMouseLeave();
});
expect(result.current.hoveredSpanId).toBeNull();
expect(result.current.tooltipContent).toBeNull();
});
it('suppresses click when drag distance exceeds threshold', () => {
const { result } = renderHook(() => useFlamegraphHover(defaultArgs));
act(() => {
result.current.handleMouseDownForClick({
clientX: 100,
clientY: 50,
} as React.MouseEvent);
});
act(() => {
result.current.handleClick({
clientX: 150,
clientY: 100,
} as React.MouseEvent);
});
expect(defaultArgs.onSpanClick).not.toHaveBeenCalled();
});
it('calls onSpanClick when clicking on span', () => {
const { result } = renderHook(() => useFlamegraphHover(defaultArgs));
act(() => {
result.current.handleClick({
clientX: 150,
clientY: 61,
} as React.MouseEvent);
});
expect(defaultArgs.onSpanClick).toHaveBeenCalledWith('hover-span');
});
it('uses clientX/clientY for tooltip positioning', () => {
const { result } = renderHook(() => useFlamegraphHover(defaultArgs));
act(() => {
result.current.handleHoverMouseMove({
clientX: 200,
clientY: 60,
} as React.MouseEvent);
});
expect(result.current.tooltipContent?.clientX).toBe(200);
expect(result.current.tooltipContent?.clientY).toBe(60);
});
it('does not update hover during drag', () => {
const { result } = renderHook(() => useFlamegraphHover(defaultArgs));
defaultArgs.isDraggingRef.current = true;
act(() => {
result.current.handleHoverMouseMove({
clientX: 150,
clientY: 61,
} as React.MouseEvent);
});
expect(result.current.hoveredSpanId).toBeNull();
});
});

View File

@@ -0,0 +1,279 @@
import { act, renderHook } from '@testing-library/react';
import { DEFAULT_ROW_HEIGHT, MIN_VISIBLE_SPAN_MS } from '../constants';
import { useFlamegraphZoom } from '../hooks/useFlamegraphZoom';
import { MOCK_TRACE_METADATA } from './testUtils';
function createMockCanvas(): HTMLCanvasElement {
const canvas = document.createElement('canvas');
canvas.width = 800;
canvas.height = 400;
canvas.getBoundingClientRect = jest.fn(
(): DOMRect =>
({
left: 0,
top: 0,
width: 800,
height: 400,
x: 0,
y: 0,
bottom: 400,
right: 800,
toJSON: (): Record<string, unknown> => ({}),
} as DOMRect),
);
return canvas;
}
describe('useFlamegraphZoom', () => {
const traceMetadata = { ...MOCK_TRACE_METADATA };
beforeEach(() => {
Object.defineProperty(window, 'devicePixelRatio', {
configurable: true,
value: 1,
});
});
it('handleResetZoom restores traceMetadata.startTime/endTime', () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
const setRowHeight = jest.fn();
const viewStartRef = { current: 100 };
const viewEndRef = { current: 500 };
const rowHeightRef = { current: 30 };
const canvasRef = { current: createMockCanvas() };
const { result } = renderHook(() =>
useFlamegraphZoom({
canvasRef,
traceMetadata,
viewStartRef,
viewEndRef,
rowHeightRef,
setViewStartTs,
setViewEndTs,
setRowHeight,
}),
);
act(() => {
result.current.handleResetZoom();
});
expect(setViewStartTs).toHaveBeenCalledWith(traceMetadata.startTime);
expect(setViewEndTs).toHaveBeenCalledWith(traceMetadata.endTime);
expect(setRowHeight).toHaveBeenCalledWith(DEFAULT_ROW_HEIGHT);
expect(viewStartRef.current).toBe(traceMetadata.startTime);
expect(viewEndRef.current).toBe(traceMetadata.endTime);
expect(rowHeightRef.current).toBe(DEFAULT_ROW_HEIGHT);
});
it('wheel zoom in decreases visible time range', async () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
const setRowHeight = jest.fn();
const viewStartRef = { current: traceMetadata.startTime };
const viewEndRef = { current: traceMetadata.endTime };
const rowHeightRef = { current: DEFAULT_ROW_HEIGHT };
const canvas = createMockCanvas();
const canvasRef = { current: canvas };
renderHook(() =>
useFlamegraphZoom({
canvasRef,
traceMetadata,
viewStartRef,
viewEndRef,
rowHeightRef,
setViewStartTs,
setViewEndTs,
setRowHeight,
}),
);
const initialSpan = viewEndRef.current - viewStartRef.current;
await act(async () => {
canvas.dispatchEvent(
new WheelEvent('wheel', {
clientX: 400,
deltaY: -100,
bubbles: true,
}),
);
});
await act(async () => {
await new Promise((r) => requestAnimationFrame(r));
});
expect(setViewStartTs).toHaveBeenCalled();
expect(setViewEndTs).toHaveBeenCalled();
const [newStart] = setViewStartTs.mock.calls[0] ?? [];
const [newEnd] = setViewEndTs.mock.calls[0] ?? [];
if (newStart != null && newEnd != null) {
const newSpan = newEnd - newStart;
expect(newSpan).toBeLessThan(initialSpan);
}
});
it('wheel zoom out increases visible time range', async () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
const setRowHeight = jest.fn();
const halfSpan = (traceMetadata.endTime - traceMetadata.startTime) / 2;
const viewStartRef = { current: traceMetadata.startTime + halfSpan * 0.25 };
const viewEndRef = { current: traceMetadata.startTime + halfSpan * 0.75 };
const rowHeightRef = { current: DEFAULT_ROW_HEIGHT };
const canvas = createMockCanvas();
const canvasRef = { current: canvas };
renderHook(() =>
useFlamegraphZoom({
canvasRef,
traceMetadata,
viewStartRef,
viewEndRef,
rowHeightRef,
setViewStartTs,
setViewEndTs,
setRowHeight,
}),
);
const initialSpan = viewEndRef.current - viewStartRef.current;
await act(async () => {
canvas.dispatchEvent(
new WheelEvent('wheel', {
clientX: 400,
deltaY: 100,
bubbles: true,
}),
);
});
await act(async () => {
await new Promise((r) => requestAnimationFrame(r));
});
expect(setViewStartTs).toHaveBeenCalled();
expect(setViewEndTs).toHaveBeenCalled();
const [newStart] = setViewStartTs.mock.calls[0] ?? [];
const [newEnd] = setViewEndTs.mock.calls[0] ?? [];
if (newStart != null && newEnd != null) {
const newSpan = newEnd - newStart;
expect(newSpan).toBeGreaterThanOrEqual(initialSpan);
}
});
it('clamps zoom to MIN_VISIBLE_SPAN_MS', async () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
const setRowHeight = jest.fn();
const viewStartRef = { current: traceMetadata.startTime };
const viewEndRef = { current: traceMetadata.startTime + 100 };
const rowHeightRef = { current: DEFAULT_ROW_HEIGHT };
const canvas = createMockCanvas();
const canvasRef = { current: canvas };
renderHook(() =>
useFlamegraphZoom({
canvasRef,
traceMetadata,
viewStartRef,
viewEndRef,
rowHeightRef,
setViewStartTs,
setViewEndTs,
setRowHeight,
}),
);
await act(async () => {
canvas.dispatchEvent(
new WheelEvent('wheel', {
clientX: 400,
deltaY: 10000,
bubbles: true,
}),
);
});
await act(async () => {
await new Promise((r) => requestAnimationFrame(r));
});
const [newStart] = setViewStartTs.mock.calls[0] ?? [];
const [newEnd] = setViewEndTs.mock.calls[0] ?? [];
if (newStart != null && newEnd != null) {
const newSpan = newEnd - newStart;
expect(newSpan).toBeGreaterThanOrEqual(MIN_VISIBLE_SPAN_MS);
}
});
it('clamps viewStart/viewEnd to trace bounds', async () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
const setRowHeight = jest.fn();
const viewStartRef = { current: traceMetadata.startTime };
const viewEndRef = { current: traceMetadata.endTime };
const rowHeightRef = { current: DEFAULT_ROW_HEIGHT };
const canvas = createMockCanvas();
const canvasRef = { current: canvas };
renderHook(() =>
useFlamegraphZoom({
canvasRef,
traceMetadata,
viewStartRef,
viewEndRef,
rowHeightRef,
setViewStartTs,
setViewEndTs,
setRowHeight,
}),
);
await act(async () => {
canvas.dispatchEvent(
new WheelEvent('wheel', {
clientX: 400,
deltaY: -5000,
bubbles: true,
}),
);
});
await act(async () => {
await new Promise((r) => requestAnimationFrame(r));
});
const [newStart] = setViewStartTs.mock.calls[0] ?? [];
const [newEnd] = setViewEndTs.mock.calls[0] ?? [];
if (newStart != null && newEnd != null) {
expect(newStart).toBeGreaterThanOrEqual(traceMetadata.startTime);
expect(newEnd).toBeLessThanOrEqual(traceMetadata.endTime);
}
});
it('returns isOverFlamegraphRef', () => {
const canvasRef = { current: createMockCanvas() };
const { result } = renderHook(() =>
useFlamegraphZoom({
canvasRef,
traceMetadata,
viewStartRef: { current: 0 },
viewEndRef: { current: 1000 },
rowHeightRef: { current: 24 },
setViewStartTs: jest.fn(),
setViewEndTs: jest.fn(),
setRowHeight: jest.fn(),
}),
);
expect(result.current.isOverFlamegraphRef).toBeDefined();
expect(result.current.isOverFlamegraphRef.current).toBe(false);
});
});

View File

@@ -0,0 +1,212 @@
import type { Dispatch, SetStateAction } from 'react';
import { useRef } from 'react';
import { act, render, waitFor } from '@testing-library/react';
import { useScrollToSpan } from '../hooks/useScrollToSpan';
import { MOCK_SPANS, MOCK_TRACE_METADATA } from './testUtils';
function TestWrapper({
firstSpanAtFetchLevel,
spans,
traceMetadata,
setViewStartTs,
setViewEndTs,
setScrollTop,
}: {
firstSpanAtFetchLevel: string;
spans: typeof MOCK_SPANS;
traceMetadata: typeof MOCK_TRACE_METADATA;
setViewStartTs: Dispatch<SetStateAction<number>>;
setViewEndTs: Dispatch<SetStateAction<number>>;
setScrollTop: Dispatch<SetStateAction<number>>;
}): JSX.Element {
const containerRef = useRef<HTMLDivElement>(null);
const viewStartRef = useRef(traceMetadata.startTime);
const viewEndRef = useRef(traceMetadata.endTime);
const scrollTopRef = useRef(0);
useScrollToSpan({
firstSpanAtFetchLevel,
spans,
traceMetadata,
containerRef,
viewStartRef,
viewEndRef,
scrollTopRef,
rowHeight: 24,
setViewStartTs,
setViewEndTs,
setScrollTop,
});
return <div ref={containerRef} data-testid="container" />;
}
describe('useScrollToSpan', () => {
beforeEach(() => {
Object.defineProperty(HTMLElement.prototype, 'clientHeight', {
configurable: true,
value: 400,
});
});
it('does not update when firstSpanAtFetchLevel is empty', async () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
const setScrollTop = jest.fn();
render(
<TestWrapper
firstSpanAtFetchLevel=""
spans={MOCK_SPANS}
traceMetadata={MOCK_TRACE_METADATA}
setViewStartTs={setViewStartTs}
setViewEndTs={setViewEndTs}
setScrollTop={setScrollTop}
/>,
);
await waitFor(() => {
expect(setViewStartTs).not.toHaveBeenCalled();
expect(setViewEndTs).not.toHaveBeenCalled();
expect(setScrollTop).not.toHaveBeenCalled();
});
});
it('does not update when spans are empty', async () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
const setScrollTop = jest.fn();
render(
<TestWrapper
firstSpanAtFetchLevel="root"
spans={[]}
traceMetadata={MOCK_TRACE_METADATA}
setViewStartTs={setViewStartTs}
setViewEndTs={setViewEndTs}
setScrollTop={setScrollTop}
/>,
);
await waitFor(() => {
expect(setViewStartTs).not.toHaveBeenCalled();
expect(setViewEndTs).not.toHaveBeenCalled();
expect(setScrollTop).not.toHaveBeenCalled();
});
});
it('does not update when target span not found', async () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
const setScrollTop = jest.fn();
render(
<TestWrapper
firstSpanAtFetchLevel="nonexistent"
spans={MOCK_SPANS}
traceMetadata={MOCK_TRACE_METADATA}
setViewStartTs={setViewStartTs}
setViewEndTs={setViewEndTs}
setScrollTop={setScrollTop}
/>,
);
await waitFor(() => {
expect(setViewStartTs).not.toHaveBeenCalled();
expect(setViewEndTs).not.toHaveBeenCalled();
expect(setScrollTop).not.toHaveBeenCalled();
});
});
it('calls setters when target span found', async () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
const setScrollTop = jest.fn();
const { getByTestId } = render(
<TestWrapper
firstSpanAtFetchLevel="grandchild"
spans={MOCK_SPANS}
traceMetadata={MOCK_TRACE_METADATA}
setViewStartTs={setViewStartTs}
setViewEndTs={setViewEndTs}
setScrollTop={setScrollTop}
/>,
);
expect(getByTestId('container')).toBeInTheDocument();
await waitFor(() => {
expect(setViewStartTs).toHaveBeenCalled();
expect(setViewEndTs).toHaveBeenCalled();
expect(setScrollTop).toHaveBeenCalled();
});
const [viewStart] = setViewStartTs.mock.calls[0];
const [viewEnd] = setViewEndTs.mock.calls[0];
const [scrollTop] = setScrollTop.mock.calls[0];
expect(viewEnd - viewStart).toBeGreaterThan(0);
expect(viewStart).toBeGreaterThanOrEqual(MOCK_TRACE_METADATA.startTime);
expect(viewEnd).toBeLessThanOrEqual(MOCK_TRACE_METADATA.endTime);
expect(scrollTop).toBeGreaterThanOrEqual(0);
});
it('centers span vertically (scrollTop centers span row)', async () => {
const setScrollTop = jest.fn();
await act(async () => {
render(
<TestWrapper
firstSpanAtFetchLevel="grandchild"
spans={MOCK_SPANS}
traceMetadata={MOCK_TRACE_METADATA}
setViewStartTs={jest.fn()}
setViewEndTs={jest.fn()}
setScrollTop={setScrollTop}
/>,
);
});
await waitFor(() => expect(setScrollTop).toHaveBeenCalled());
const [scrollTop] = setScrollTop.mock.calls[0];
const levelIndex = 2;
const rowHeight = 24;
const viewportHeight = 400;
const expectedCenter =
levelIndex * rowHeight - viewportHeight / 2 + rowHeight / 2;
expect(scrollTop).toBeCloseTo(Math.max(0, expectedCenter), -1);
});
it('zooms horizontally to span with 2x duration padding', async () => {
const setViewStartTs = jest.fn();
const setViewEndTs = jest.fn();
await act(async () => {
render(
<TestWrapper
firstSpanAtFetchLevel="root"
spans={MOCK_SPANS}
traceMetadata={MOCK_TRACE_METADATA}
setViewStartTs={setViewStartTs}
setViewEndTs={setViewEndTs}
setScrollTop={jest.fn()}
/>,
);
});
await waitFor(() => {
expect(setViewStartTs).toHaveBeenCalled();
expect(setViewEndTs).toHaveBeenCalled();
});
const [viewStart] = setViewStartTs.mock.calls[0];
const [viewEnd] = setViewEndTs.mock.calls[0];
const visibleWindow = viewEnd - viewStart;
const rootSpan = MOCK_SPANS[0][0];
const spanDurationMs = rootSpan.durationNano / 1e6;
expect(visibleWindow).toBeGreaterThanOrEqual(Math.max(spanDurationMs * 2, 5));
});
});

View File

@@ -0,0 +1,135 @@
import {
clamp,
findSpanById,
formatDuration,
getFlamegraphRowMetrics,
} from '../utils';
import { MOCK_SPANS } from './testUtils';
jest.mock('container/TraceDetail/utils', () => ({
convertTimeToRelevantUnit: (
valueMs: number,
): { time: number; timeUnitName: string } => {
if (valueMs === 0) {
return { time: 0, timeUnitName: 'ms' };
}
if (valueMs < 1) {
return { time: valueMs, timeUnitName: 'ms' };
}
if (valueMs < 1000) {
return { time: valueMs, timeUnitName: 'ms' };
}
if (valueMs < 60_000) {
return { time: valueMs / 1000, timeUnitName: 's' };
}
if (valueMs < 3_600_000) {
return { time: valueMs / 60_000, timeUnitName: 'm' };
}
return { time: valueMs / 3_600_000, timeUnitName: 'hr' };
},
}));
describe('Pure Math and Data Utils', () => {
describe('clamp', () => {
it('returns value when within range', () => {
expect(clamp(5, 0, 10)).toBe(5);
expect(clamp(-3, -5, 5)).toBe(-3);
});
it('returns min when value is below min', () => {
expect(clamp(-1, 0, 10)).toBe(0);
expect(clamp(2, 5, 10)).toBe(5);
});
it('returns max when value is above max', () => {
expect(clamp(11, 0, 10)).toBe(10);
expect(clamp(100, 0, 50)).toBe(50);
});
it('handles min === max', () => {
expect(clamp(5, 7, 7)).toBe(7);
expect(clamp(7, 7, 7)).toBe(7);
});
});
describe('findSpanById', () => {
it('finds span in first level', () => {
const result = findSpanById(MOCK_SPANS, 'root');
expect(result).not.toBeNull();
expect(result?.span.spanId).toBe('root');
expect(result?.levelIndex).toBe(0);
});
it('finds span in nested level', () => {
const result = findSpanById(MOCK_SPANS, 'grandchild');
expect(result).not.toBeNull();
expect(result?.span.spanId).toBe('grandchild');
expect(result?.levelIndex).toBe(2);
});
it('returns null when span not found', () => {
expect(findSpanById(MOCK_SPANS, 'nonexistent')).toBeNull();
});
it('handles empty spans', () => {
expect(findSpanById([], 'root')).toBeNull();
expect(findSpanById([[], []], 'root')).toBeNull();
});
});
describe('getFlamegraphRowMetrics', () => {
it('computes normal row height metrics (24px)', () => {
const m = getFlamegraphRowMetrics(24);
expect(m.ROW_HEIGHT).toBe(24);
expect(m.SPAN_BAR_HEIGHT).toBe(22);
expect(m.SPAN_BAR_Y_OFFSET).toBe(1);
expect(m.EVENT_DOT_SIZE).toBe(6);
});
it('clamps span bar height to max for large row heights', () => {
const m = getFlamegraphRowMetrics(100);
expect(m.SPAN_BAR_HEIGHT).toBe(22);
expect(m.SPAN_BAR_Y_OFFSET).toBe(39);
});
it('clamps span bar height to min for small row heights', () => {
const m = getFlamegraphRowMetrics(6);
expect(m.SPAN_BAR_HEIGHT).toBe(8);
// spanBarYOffset = floor((6-8)/2) = -1 when bar exceeds row height
expect(m.SPAN_BAR_Y_OFFSET).toBe(-1);
});
it('clamps event dot size within min/max', () => {
const mSmall = getFlamegraphRowMetrics(6);
expect(mSmall.EVENT_DOT_SIZE).toBe(4);
const mLarge = getFlamegraphRowMetrics(24);
expect(mLarge.EVENT_DOT_SIZE).toBe(6);
});
});
describe('formatDuration', () => {
it('formats nanos as ms', () => {
// 1e6 nanos = 1ms
expect(formatDuration(1_000_000)).toBe('1ms');
});
it('formats larger durations as s/m/hr', () => {
// 2e9 nanos = 2000ms = 2s
expect(formatDuration(2_000_000_000)).toBe('2s');
});
it('formats zero duration', () => {
expect(formatDuration(0)).toBe('0ms');
});
it('formats very small values', () => {
// 1000 nanos = 0.001ms → mock returns { time: 0.001, timeUnitName: 'ms' },
// which formats to '0ms' after rounding
expect(formatDuration(1000)).toBe('0ms');
});
it('formats decimal seconds correctly', () => {
expect(formatDuration(1_500_000_000)).toBe('1.5s');
});
});
});

View File

@@ -0,0 +1,67 @@
import { getSpanColor } from '../utils';
import { MOCK_SPAN } from './testUtils';
const mockGenerateColor = jest.fn();
jest.mock('lib/uPlotLib/utils/generateColor', () => ({
generateColor: (key: string, colorMap: Record<string, string>): string =>
mockGenerateColor(key, colorMap),
}));
describe('Presentation / Styling Utils', () => {
beforeEach(() => {
jest.clearAllMocks();
mockGenerateColor.mockReturnValue('#2F80ED');
});
describe('getSpanColor', () => {
it('uses generated service color for normal span', () => {
mockGenerateColor.mockReturnValue('#1890ff');
const color = getSpanColor({
span: { ...MOCK_SPAN, hasError: false },
isDarkMode: false,
});
expect(mockGenerateColor).toHaveBeenCalledWith(
MOCK_SPAN.serviceName,
expect.any(Object),
);
expect(color).toBe('#1890ff');
});
it('overrides with error color in light mode when span has error', () => {
mockGenerateColor.mockReturnValue('#1890ff');
const color = getSpanColor({
span: { ...MOCK_SPAN, hasError: true },
isDarkMode: false,
});
expect(color).toBe('rgb(220, 38, 38)');
});
it('overrides with error color in dark mode when span has error', () => {
mockGenerateColor.mockReturnValue('#1890ff');
const color = getSpanColor({
span: { ...MOCK_SPAN, hasError: true },
isDarkMode: true,
});
expect(color).toBe('rgb(239, 68, 68)');
});
it('passes serviceName to generateColor', () => {
getSpanColor({
span: { ...MOCK_SPAN, serviceName: 'my-service' },
isDarkMode: false,
});
expect(mockGenerateColor).toHaveBeenCalledWith(
'my-service',
expect.any(Object),
);
});
});
});

View File

@@ -0,0 +1,370 @@
/* eslint-disable sonarjs/cognitive-complexity */
import { FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
export interface ConnectorLine {
parentRow: number;
childRow: number;
timestampMs: number;
serviceName: string;
}
export interface VisualLayout {
visualRows: FlamegraphSpan[][];
spanToVisualRow: Record<string, number>;
connectors: ConnectorLine[];
totalVisualRows: number;
}
/**
* Computes an overlap-safe visual layout for flamegraph spans using DFS ordering.
*
* Builds a parent→children tree from parentSpanId, then traverses in DFS pre-order.
 * Each span is placed at parentRow+1 if free, otherwise scans downward row-by-row
* until finding a non-overlapping row. This keeps children visually close to their
* parents and avoids the BFS problem where distant siblings push children far down.
*/
export function computeVisualLayout(spans: FlamegraphSpan[][]): VisualLayout {
const spanToVisualRow = new Map<string, number>();
const visualRowsMap = new Map<number, FlamegraphSpan[]>();
let maxRow = -1;
// Per-row interval list for overlap detection
// Each entry: [startTime, endTime]
const rowIntervals = new Map<number, Array<[number, number]>>();
function addToRow(row: number, span: FlamegraphSpan): void {
spanToVisualRow.set(span.spanId, row);
let arr = visualRowsMap.get(row);
if (!arr) {
arr = [];
visualRowsMap.set(row, arr);
}
arr.push(span);
const startTime = span.timestamp;
const endTime = span.timestamp + span.durationNano / 1e6;
let intervals = rowIntervals.get(row);
if (!intervals) {
intervals = [];
rowIntervals.set(row, intervals);
}
intervals.push([startTime, endTime]);
if (row > maxRow) {
maxRow = row;
}
}
// Flatten all spans and build lookup + children map
const spanMap = new Map<string, FlamegraphSpan>();
const childrenMap = new Map<string, FlamegraphSpan[]>();
const allSpans: FlamegraphSpan[] = [];
for (const level of spans) {
for (const span of level) {
allSpans.push(span);
spanMap.set(span.spanId, span);
}
}
// Extract parentSpanId — the field may be missing at runtime when the API
// returns `references` instead. Fall back to the first CHILD_OF reference.
function getParentId(span: FlamegraphSpan): string {
if (span.parentSpanId) {
return span.parentSpanId;
}
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const refs = (span as any).references as
| Array<{ spanId?: string; refType?: string }>
| undefined;
if (refs) {
for (const ref of refs) {
if (ref.refType === 'CHILD_OF' && ref.spanId) {
return ref.spanId;
}
}
}
return '';
}
// Build children map and identify roots
const roots: FlamegraphSpan[] = [];
for (const span of allSpans) {
const parentId = getParentId(span);
if (!parentId || !spanMap.has(parentId)) {
roots.push(span);
} else {
let children = childrenMap.get(parentId);
if (!children) {
children = [];
childrenMap.set(parentId, children);
}
children.push(span);
}
}
// Sort children by start timestamp (descending) for deterministic ordering
for (const [, children] of childrenMap) {
children.sort((a, b) => b.timestamp - a.timestamp);
}
// --- Subtree-unit placement ---
// Compute each subtree's layout in isolation, then place as a unit
// to guarantee parent-child adjacency within subtrees.
interface ShapeEntry {
span: FlamegraphSpan;
relativeRow: number;
}
function hasOverlapIn(
intervals: Map<number, Array<[number, number]>>,
row: number,
startTime: number,
endTime: number,
): boolean {
const existing = intervals.get(row);
if (!existing) {
return false;
}
for (const [s, e] of existing) {
if (startTime < e && endTime > s) {
return true;
}
}
return false;
}
function addIntervalTo(
intervals: Map<number, Array<[number, number]>>,
row: number,
startTime: number,
endTime: number,
): void {
let arr = intervals.get(row);
if (!arr) {
arr = [];
intervals.set(row, arr);
}
arr.push([startTime, endTime]);
}
function hasConnectorConflict(
intervals: Map<number, Array<[number, number]>>,
row: number,
point: number,
): boolean {
const existing = intervals.get(row);
if (!existing) {
return false;
}
for (const [s, e] of existing) {
if (point >= s && point < e) {
return true;
}
}
return false;
}
function hasPointInSpan(
connectorPoints: Map<number, number[]>,
row: number,
startTime: number,
endTime: number,
): boolean {
const points = connectorPoints.get(row);
if (!points) {
return false;
}
for (const p of points) {
if (p >= startTime && p < endTime) {
return true;
}
}
return false;
}
function addConnectorPoint(
connectorPoints: Map<number, number[]>,
row: number,
point: number,
): void {
let arr = connectorPoints.get(row);
if (!arr) {
arr = [];
connectorPoints.set(row, arr);
}
arr.push(point);
}
function computeSubtreeShape(rootSpan: FlamegraphSpan): ShapeEntry[] {
const localIntervals = new Map<number, Array<[number, number]>>();
const localConnectorPoints = new Map<number, number[]>();
const shape: ShapeEntry[] = [];
// Place root span at relative row 0
const rootStart = rootSpan.timestamp;
const rootEnd = rootSpan.timestamp + rootSpan.durationNano / 1e6;
shape.push({ span: rootSpan, relativeRow: 0 });
addIntervalTo(localIntervals, 0, rootStart, rootEnd);
const children = childrenMap.get(rootSpan.spanId);
if (children) {
for (const child of children) {
const childShape = computeSubtreeShape(child);
const connectorX = child.timestamp;
const offset = findPlacement(
childShape,
1,
localIntervals,
localConnectorPoints,
connectorX,
);
// Record connector points for intermediate rows (1 to offset-1)
for (let r = 1; r < offset; r++) {
addConnectorPoint(localConnectorPoints, r, connectorX);
}
// Place child shape into local state at offset
for (const entry of childShape) {
const actualRow = entry.relativeRow + offset;
shape.push({ span: entry.span, relativeRow: actualRow });
const s = entry.span.timestamp;
const e = entry.span.timestamp + entry.span.durationNano / 1e6;
addIntervalTo(localIntervals, actualRow, s, e);
}
}
}
return shape;
}
function findPlacement(
shape: ShapeEntry[],
minOffset: number,
intervals: Map<number, Array<[number, number]>>,
connectorPoints?: Map<number, number[]>,
connectorX?: number,
): number {
// Track the first offset that passes Checks 1 & 2 as a fallback.
// Check 3 (connector vs span) is monotonically failing: once it fails
// at offset K, all offsets > K also fail (more intermediate rows).
// If we can't satisfy Check 3, fall back to the best offset without it.
let fallbackOffset = -1;
for (let offset = minOffset; ; offset++) {
let passesSpanChecks = true;
// Check 1: span vs span (existing)
for (const entry of shape) {
const targetRow = entry.relativeRow + offset;
const s = entry.span.timestamp;
const e = entry.span.timestamp + entry.span.durationNano / 1e6;
if (hasOverlapIn(intervals, targetRow, s, e)) {
passesSpanChecks = false;
break;
}
}
// Check 2: span vs existing connector points
if (passesSpanChecks && connectorPoints) {
for (const entry of shape) {
const targetRow = entry.relativeRow + offset;
const s = entry.span.timestamp;
const e = entry.span.timestamp + entry.span.durationNano / 1e6;
if (hasPointInSpan(connectorPoints, targetRow, s, e)) {
passesSpanChecks = false;
break;
}
}
}
if (!passesSpanChecks) {
continue;
}
// This offset passes Checks 1 & 2 — record as fallback
if (fallbackOffset === -1) {
fallbackOffset = offset;
}
// Check 3: new connector vs existing spans
if (connectorX !== undefined) {
let connectorClear = true;
for (let r = 1; r < offset; r++) {
if (hasConnectorConflict(intervals, r, connectorX)) {
connectorClear = false;
break;
}
}
if (!connectorClear) {
// Check 3 will fail for all larger offsets too.
// Fall back to the first offset that passed Checks 1 & 2.
return fallbackOffset;
}
}
return offset;
}
}
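// Worked example (illustrative): suppose a child shape overlaps existing
// spans at offsets 1 and 2 (Check 1) and is first clear at offset 3, but its
// connector column crosses an existing span somewhere in rows 1..2 (Check 3).
// Since Check 3 only gets stricter as the offset grows, the search returns
// the fallback offset 3 rather than scanning forever.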
// Process roots sorted by timestamp
roots.sort((a, b) => a.timestamp - b.timestamp);
for (const root of roots) {
const shape = computeSubtreeShape(root);
const offset = findPlacement(shape, 0, rowIntervals);
for (const entry of shape) {
addToRow(entry.relativeRow + offset, entry.span);
}
}
// Build the visualRows array
const totalVisualRows = maxRow + 1;
const visualRows: FlamegraphSpan[][] = [];
for (let i = 0; i < totalVisualRows; i++) {
visualRows.push(visualRowsMap.get(i) || []);
}
// Build connector lines for parent-child pairs with row gap > 1
const connectors: ConnectorLine[] = [];
for (const [parentId, children] of childrenMap) {
const parentRow = spanToVisualRow.get(parentId);
if (parentRow === undefined) {
continue;
}
for (const child of children) {
const childRow = spanToVisualRow.get(child.spanId);
if (childRow === undefined || childRow - parentRow <= 1) {
continue;
}
connectors.push({
parentRow,
childRow,
timestampMs: child.timestamp,
serviceName: child.serviceName,
});
}
}
return {
visualRows,
spanToVisualRow: Object.fromEntries(spanToVisualRow),
connectors,
totalVisualRows,
};
}
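// Illustrative usage sketch (variable names are hypothetical, not part of
// this module):
//
// const layout = computeVisualLayout(spansByLevel);
// layout.visualRows       // FlamegraphSpan[][], indexed by visual row
// layout.spanToVisualRow  // { [spanId]: rowIndex }
// layout.connectors       // drawn wherever childRow - parentRow > 1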

View File

@@ -0,0 +1,36 @@
export const ROW_HEIGHT = 24;
export const SPAN_BAR_HEIGHT = 22;
export const SPAN_BAR_Y_OFFSET = Math.floor((ROW_HEIGHT - SPAN_BAR_HEIGHT) / 2);
export const EVENT_DOT_SIZE = 6;
// Span bar sizing relative to row height (used by getFlamegraphRowMetrics)
export const SPAN_BAR_HEIGHT_RATIO = SPAN_BAR_HEIGHT / ROW_HEIGHT;
export const MIN_SPAN_BAR_HEIGHT = 8;
export const MAX_SPAN_BAR_HEIGHT = SPAN_BAR_HEIGHT;
// Event dot sizing relative to span bar height
export const EVENT_DOT_SIZE_RATIO = EVENT_DOT_SIZE / SPAN_BAR_HEIGHT;
export const MIN_EVENT_DOT_SIZE = 4;
export const MAX_EVENT_DOT_SIZE = EVENT_DOT_SIZE;
export const LABEL_FONT = '11px Inter, sans-serif';
export const LABEL_PADDING_X = 8;
export const MIN_WIDTH_FOR_NAME = 30;
export const MIN_WIDTH_FOR_NAME_AND_DURATION = 80;
// Dynamic row height (vertical zoom) -- disabled for now (MIN === MAX)
export const MIN_ROW_HEIGHT = 24;
export const MAX_ROW_HEIGHT = 24;
export const DEFAULT_ROW_HEIGHT = MIN_ROW_HEIGHT;
// Zoom intensity -- how fast zoom reacts to wheel/pinch delta
export const PINCH_ZOOM_INTENSITY_H = 0.01;
export const SCROLL_ZOOM_INTENSITY_H = 0.0015;
export const PINCH_ZOOM_INTENSITY_V = 0.008;
export const SCROLL_ZOOM_INTENSITY_V = 0.001;
// Minimum visible time span in ms (prevents zooming to sub-pixel)
export const MIN_VISIBLE_SPAN_MS = 5;
// Selected span style (dashed border)
export const DASHED_BORDER_LINE_DASH = [4, 2];
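// Worked example (assuming getFlamegraphRowMetrics scales the bar by
// SPAN_BAR_HEIGHT_RATIO and clamps): rowHeight 24 → bar 22px, Y offset 1;
// rowHeight 6 → bar clamps up to MIN_SPAN_BAR_HEIGHT (8px), so the offset is
// floor((6 - 8) / 2) = -1 and the event dot clamps down to 4px.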

View File

@@ -0,0 +1,55 @@
import { RefObject, useCallback, useEffect } from 'react';
export function useCanvasSetup(
canvasRef: RefObject<HTMLCanvasElement>,
containerRef: RefObject<HTMLDivElement>,
onDraw: () => void,
): void {
const updateCanvasSize = useCallback(() => {
const canvas = canvasRef.current;
const container = containerRef.current;
if (!canvas || !container) {
return;
}
const dpr = window.devicePixelRatio || 1;
const rect = container.getBoundingClientRect();
const viewportHeight = container.clientHeight;
canvas.style.width = `${rect.width}px`;
canvas.style.height = `${viewportHeight}px`;
const newWidth = Math.floor(rect.width * dpr);
const newHeight = Math.floor(viewportHeight * dpr);
if (canvas.width !== newWidth || canvas.height !== newHeight) {
canvas.width = newWidth;
canvas.height = newHeight;
onDraw();
}
}, [canvasRef, containerRef, onDraw]);
useEffect(() => {
const container = containerRef.current;
if (!container) {
return (): void => {};
}
const resizeObserver = new ResizeObserver(updateCanvasSize);
resizeObserver.observe(container);
updateCanvasSize();
// Update canvas size when devicePixelRatio changes (the query toggles as DPR crosses 1dppx)
const dprQuery = window.matchMedia('(resolution: 1dppx)');
dprQuery.addEventListener('change', updateCanvasSize);
return (): void => {
resizeObserver.disconnect();
dprQuery.removeEventListener('change', updateCanvasSize);
};
}, [containerRef, updateCanvasSize]);
useEffect(() => {
onDraw();
}, [onDraw]);
}
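// Illustrative wiring (a sketch; the surrounding component is assumed):
//
// const canvasRef = useRef<HTMLCanvasElement>(null);
// const containerRef = useRef<HTMLDivElement>(null);
// const { drawFlamegraph } = useFlamegraphDraw({ ... });
// useCanvasSetup(canvasRef, containerRef, drawFlamegraph);
//
// The backing store is sized to container CSS size * devicePixelRatio, and
// onDraw re-runs whenever the observed size changes.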

View File

@@ -0,0 +1,170 @@
import {
Dispatch,
MouseEvent as ReactMouseEvent,
MutableRefObject,
RefObject,
SetStateAction,
useCallback,
useRef,
} from 'react';
import { ITraceMetadata } from '../types';
import { clamp } from '../utils';
interface UseFlamegraphDragArgs {
canvasRef: RefObject<HTMLCanvasElement>;
containerRef: RefObject<HTMLDivElement>;
traceMetadata: ITraceMetadata;
viewStartRef: MutableRefObject<number>;
viewEndRef: MutableRefObject<number>;
setViewStartTs: Dispatch<SetStateAction<number>>;
setViewEndTs: Dispatch<SetStateAction<number>>;
scrollTopRef: MutableRefObject<number>;
setScrollTop: Dispatch<SetStateAction<number>>;
totalHeight: number;
}
interface UseFlamegraphDragResult {
handleMouseDown: (e: ReactMouseEvent) => void;
handleMouseMove: (e: ReactMouseEvent) => void;
handleMouseUp: () => void;
handleDragMouseLeave: () => void;
isDraggingRef: MutableRefObject<boolean>;
}
export function useFlamegraphDrag(
args: UseFlamegraphDragArgs,
): UseFlamegraphDragResult {
const {
canvasRef,
containerRef,
traceMetadata,
viewStartRef,
viewEndRef,
setViewStartTs,
setViewEndTs,
scrollTopRef,
setScrollTop,
totalHeight,
} = args;
const isDraggingRef = useRef(false);
const dragStartRef = useRef<{ x: number; y: number } | null>(null);
const dragDistanceRef = useRef(0);
const clampScrollTop = useCallback(
(next: number): number => {
const container = containerRef.current;
if (!container) {
return 0;
}
const viewportHeight = container.clientHeight;
const maxScroll = Math.max(0, totalHeight - viewportHeight);
return clamp(next, 0, maxScroll);
},
[containerRef, totalHeight],
);
const handleMouseDown = useCallback(
(event: ReactMouseEvent): void => {
if (event.button !== 0) {
return;
}
event.preventDefault();
isDraggingRef.current = true;
dragStartRef.current = { x: event.clientX, y: event.clientY };
dragDistanceRef.current = 0;
const canvas = canvasRef.current;
if (canvas) {
canvas.style.cursor = 'grabbing';
}
},
[canvasRef],
);
const handleMouseMove = useCallback(
(event: ReactMouseEvent): void => {
if (!isDraggingRef.current || !dragStartRef.current) {
return;
}
const canvas = canvasRef.current;
if (!canvas) {
return;
}
const rect = canvas.getBoundingClientRect();
const deltaX = event.clientX - dragStartRef.current.x;
const deltaY = event.clientY - dragStartRef.current.y;
dragDistanceRef.current = Math.sqrt(deltaX * deltaX + deltaY * deltaY);
// --- Horizontal pan ---
const timeSpan = viewEndRef.current - viewStartRef.current;
const deltaTime = (deltaX / rect.width) * timeSpan;
const newStart = viewStartRef.current - deltaTime;
const clampedStart = clamp(
newStart,
traceMetadata.startTime,
traceMetadata.endTime - timeSpan,
);
const clampedEnd = clampedStart + timeSpan;
viewStartRef.current = clampedStart;
viewEndRef.current = clampedEnd;
setViewStartTs(clampedStart);
setViewEndTs(clampedEnd);
// --- Vertical scroll pan ---
const nextScrollTop = clampScrollTop(scrollTopRef.current - deltaY);
scrollTopRef.current = nextScrollTop;
setScrollTop(nextScrollTop);
dragStartRef.current = { x: event.clientX, y: event.clientY };
},
[
canvasRef,
traceMetadata,
viewStartRef,
viewEndRef,
setViewStartTs,
setViewEndTs,
scrollTopRef,
setScrollTop,
clampScrollTop,
],
);
const handleMouseUp = useCallback((): void => {
isDraggingRef.current = false;
dragStartRef.current = null;
dragDistanceRef.current = 0;
const canvas = canvasRef.current;
if (canvas) {
canvas.style.cursor = 'grab';
}
}, [canvasRef]);
return {
handleMouseDown,
handleMouseMove,
handleMouseUp,
handleDragMouseLeave: handleMouseUp, // Same logic for mouse up and leaving the canvas
isDraggingRef,
};
}
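// Pan math, worked example: with an 800px-wide canvas showing a 1000ms
// window, dragging the mouse 80px to the left (deltaX = -80) gives
// deltaTime = (-80 / 800) * 1000 = -100ms, so newStart = start + 100ms and
// the view pans forward in time before clamping to the trace bounds.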

View File

@@ -0,0 +1,317 @@
import React, { RefObject, useCallback, useRef } from 'react';
import { themeColors } from 'constants/theme';
import { generateColor } from 'lib/uPlotLib/utils/generateColor';
import { FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
import { ConnectorLine } from '../computeVisualLayout';
import { EventRect, SpanRect } from '../types';
import {
clamp,
drawSpanBar,
FlamegraphRowMetrics,
getFlamegraphRowMetrics,
getSpanColor,
} from '../utils';
interface UseFlamegraphDrawArgs {
canvasRef: RefObject<HTMLCanvasElement>;
containerRef: RefObject<HTMLDivElement>;
spans: FlamegraphSpan[][];
connectors: ConnectorLine[];
viewStartTs: number;
viewEndTs: number;
scrollTop: number;
rowHeight: number;
selectedSpanId: string | undefined;
hoveredSpanId: string;
isDarkMode: boolean;
spanRectsRef?: React.MutableRefObject<SpanRect[]>;
eventRectsRef?: React.MutableRefObject<EventRect[]>;
hoveredEventKey?: string | null;
}
interface UseFlamegraphDrawResult {
drawFlamegraph: () => void;
spanRectsRef: RefObject<SpanRect[]>;
eventRectsRef: RefObject<EventRect[]>;
}
const OVERSCAN_ROWS = 4;
interface DrawLevelArgs {
ctx: CanvasRenderingContext2D;
levelSpans: FlamegraphSpan[];
levelIndex: number;
y: number;
viewStartTs: number;
timeSpan: number;
cssWidth: number;
selectedSpanId: string | undefined;
hoveredSpanId: string;
isDarkMode: boolean;
spanRectsArray: SpanRect[];
eventRectsArray: EventRect[];
metrics: FlamegraphRowMetrics;
hoveredEventKey?: string | null;
}
function drawLevel(args: DrawLevelArgs): void {
const {
ctx,
levelSpans,
levelIndex,
y,
viewStartTs,
timeSpan,
cssWidth,
selectedSpanId,
hoveredSpanId,
isDarkMode,
spanRectsArray,
eventRectsArray,
metrics,
hoveredEventKey,
} = args;
const viewEndTs = viewStartTs + timeSpan;
for (let i = 0; i < levelSpans.length; i++) {
const span = levelSpans[i];
const spanStartMs = span.timestamp;
const spanEndMs = span.timestamp + span.durationNano / 1e6;
// Time culling -- skip spans entirely outside the visible time window
if (spanEndMs < viewStartTs || spanStartMs > viewEndTs) {
continue;
}
const leftOffset = ((spanStartMs - viewStartTs) / timeSpan) * cssWidth;
const rightEdge = ((spanEndMs - viewStartTs) / timeSpan) * cssWidth;
let width = rightEdge - leftOffset;
// Clamp to visible x-range
if (leftOffset < 0) {
width += leftOffset;
if (width <= 0) {
continue;
}
}
if (rightEdge > cssWidth) {
width = cssWidth - Math.max(0, leftOffset);
if (width <= 0) {
continue;
}
}
// Minimum 1px width so tiny spans remain visible
width = clamp(width, 1, Infinity);
const color = getSpanColor({ span, isDarkMode });
drawSpanBar({
ctx,
span,
x: Math.max(0, leftOffset),
y,
width,
levelIndex,
spanRectsArray,
eventRectsArray,
color,
isDarkMode,
metrics,
selectedSpanId,
hoveredSpanId,
hoveredEventKey,
});
}
}
interface DrawConnectorLinesArgs {
ctx: CanvasRenderingContext2D;
connectors: ConnectorLine[];
scrollTop: number;
viewStartTs: number;
timeSpan: number;
cssWidth: number;
viewportHeight: number;
metrics: FlamegraphRowMetrics;
}
function drawConnectorLines(args: DrawConnectorLinesArgs): void {
const {
ctx,
connectors,
scrollTop,
viewStartTs,
timeSpan,
cssWidth,
viewportHeight,
metrics,
} = args;
ctx.save();
ctx.lineWidth = 1;
ctx.globalAlpha = 0.6;
for (const conn of connectors) {
const xFrac = (conn.timestampMs - viewStartTs) / timeSpan;
if (xFrac < -0.01 || xFrac > 1.01) {
continue;
}
const parentY =
conn.parentRow * metrics.ROW_HEIGHT -
scrollTop +
metrics.SPAN_BAR_Y_OFFSET +
metrics.SPAN_BAR_HEIGHT;
const childY =
conn.childRow * metrics.ROW_HEIGHT - scrollTop + metrics.SPAN_BAR_Y_OFFSET;
// Skip if entirely outside viewport
if (parentY > viewportHeight || childY < 0) {
continue;
}
const color = generateColor(
conn.serviceName,
themeColors.traceDetailColorsV3,
);
ctx.strokeStyle = color;
const x = clamp(xFrac * cssWidth, 0, cssWidth);
ctx.beginPath();
ctx.moveTo(x, parentY);
ctx.lineTo(x, childY);
ctx.stroke();
}
ctx.restore();
}
export function useFlamegraphDraw(
args: UseFlamegraphDrawArgs,
): UseFlamegraphDrawResult {
const {
canvasRef,
containerRef,
spans,
connectors,
viewStartTs,
viewEndTs,
scrollTop,
rowHeight,
selectedSpanId,
hoveredSpanId,
isDarkMode,
spanRectsRef: spanRectsRefProp,
eventRectsRef: eventRectsRefProp,
hoveredEventKey,
} = args;
const spanRectsRefInternal = useRef<SpanRect[]>([]);
const spanRectsRef = spanRectsRefProp ?? spanRectsRefInternal;
const eventRectsRefInternal = useRef<EventRect[]>([]);
const eventRectsRef = eventRectsRefProp ?? eventRectsRefInternal;
const drawFlamegraph = useCallback(() => {
const canvas = canvasRef.current;
const container = containerRef.current;
if (!canvas || !container) {
return;
}
const ctx = canvas.getContext('2d');
if (!ctx) {
return;
}
const dpr = window.devicePixelRatio || 1;
ctx.setTransform(dpr, 0, 0, dpr, 0, 0);
const timeSpan = viewEndTs - viewStartTs;
if (timeSpan <= 0) {
return;
}
const cssWidth = canvas.width / dpr;
const metrics = getFlamegraphRowMetrics(rowHeight);
// ---- Vertical clipping window ----
const viewportHeight = container.clientHeight;
// Start drawing OVERSCAN_ROWS (4) rows above the visible area.
const firstLevel = Math.max(
0,
Math.floor(scrollTop / metrics.ROW_HEIGHT) - OVERSCAN_ROWS,
);
// Add 2 * OVERSCAN_ROWS to the visible count so the overscan extends both above and below the viewport.
const visibleLevelCount =
Math.ceil(viewportHeight / metrics.ROW_HEIGHT) + 2 * OVERSCAN_ROWS;
const lastLevel = Math.min(spans.length - 1, firstLevel + visibleLevelCount);
ctx.clearRect(0, 0, cssWidth, viewportHeight);
// ---- Draw connector lines (behind span bars) ----
drawConnectorLines({
ctx,
connectors,
scrollTop,
viewStartTs,
timeSpan,
cssWidth,
viewportHeight,
metrics,
});
const spanRectsArray: SpanRect[] = [];
const eventRectsArray: EventRect[] = [];
const currentHoveredEventKey = hoveredEventKey ?? null;
// ---- Draw only visible levels ----
for (let levelIndex = firstLevel; levelIndex <= lastLevel; levelIndex++) {
const levelSpans = spans[levelIndex];
if (!levelSpans) {
continue;
}
drawLevel({
ctx,
levelSpans,
levelIndex,
y: levelIndex * metrics.ROW_HEIGHT - scrollTop,
viewStartTs,
timeSpan,
cssWidth,
selectedSpanId,
hoveredSpanId,
isDarkMode,
spanRectsArray,
eventRectsArray,
metrics,
hoveredEventKey: currentHoveredEventKey,
});
}
spanRectsRef.current = spanRectsArray;
eventRectsRef.current = eventRectsArray;
}, [
canvasRef,
containerRef,
spanRectsRef,
eventRectsRef,
spans,
connectors,
viewStartTs,
viewEndTs,
scrollTop,
rowHeight,
selectedSpanId,
hoveredSpanId,
hoveredEventKey,
isDarkMode,
]);
return { drawFlamegraph, spanRectsRef, eventRectsRef };
}
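// Virtualization note: only rows in [firstLevel, lastLevel] (the viewport
// plus OVERSCAN_ROWS on each side) are drawn, and spans within each row are
// time-culled against the visible window, so draw cost tracks what is on
// screen rather than the total span count.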

View File

@@ -0,0 +1,295 @@
import {
Dispatch,
MouseEvent as ReactMouseEvent,
MutableRefObject,
RefObject,
SetStateAction,
useCallback,
useRef,
useState,
} from 'react';
import { FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
import { EventRect, SpanRect } from '../types';
import { ITraceMetadata } from '../types';
import { getSpanColor } from '../utils';
function getCanvasPointer(
canvas: HTMLCanvasElement,
clientX: number,
clientY: number,
): { cssX: number; cssY: number } | null {
const rect = canvas.getBoundingClientRect();
const dpr = window.devicePixelRatio || 1;
const cssWidth = canvas.width / dpr;
const cssHeight = canvas.height / dpr;
const cssX = (clientX - rect.left) * (cssWidth / rect.width);
const cssY = (clientY - rect.top) * (cssHeight / rect.height);
return { cssX, cssY };
}
function findSpanAtPosition(
cssX: number,
cssY: number,
spanRects: SpanRect[],
): FlamegraphSpan | null {
for (let i = spanRects.length - 1; i >= 0; i--) {
const r = spanRects[i];
if (
cssX >= r.x &&
cssX <= r.x + r.width &&
cssY >= r.y &&
cssY <= r.y + r.height
) {
return r.span;
}
}
return null;
}
function findEventAtPosition(
cssX: number,
cssY: number,
eventRects: EventRect[],
): EventRect | null {
for (let i = eventRects.length - 1; i >= 0; i--) {
const r = eventRects[i];
// Manhattan distance check for diamond shape with padding
if (Math.abs(r.cx - cssX) + Math.abs(r.cy - cssY) <= r.halfSize * 1.5) {
return r;
}
}
return null;
}
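// |r.cx - cssX| + |r.cy - cssY| <= d traces a diamond of "radius" d around
// (cx, cy), matching the rotated-square event dot; the 1.5 multiplier pads
// the hit area so near-misses still register.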
export interface EventTooltipData {
name: string;
timeOffsetMs: number;
isError: boolean;
attributeMap: Record<string, string>;
}
export interface TooltipContent {
serviceName: string;
spanName: string;
status: 'ok' | 'warning' | 'error';
startMs: number;
durationMs: number;
clientX: number;
clientY: number;
spanColor: string;
event?: EventTooltipData;
}
interface UseFlamegraphHoverArgs {
canvasRef: RefObject<HTMLCanvasElement>;
spanRectsRef: MutableRefObject<SpanRect[]>;
eventRectsRef: MutableRefObject<EventRect[]>;
traceMetadata: ITraceMetadata;
viewStartTs: number;
viewEndTs: number;
isDraggingRef: MutableRefObject<boolean>;
onSpanClick: (spanId: string) => void;
isDarkMode: boolean;
}
interface UseFlamegraphHoverResult {
hoveredSpanId: string | null;
setHoveredSpanId: Dispatch<SetStateAction<string | null>>;
hoveredEventKey: string | null;
handleHoverMouseMove: (e: ReactMouseEvent) => void;
handleHoverMouseLeave: () => void;
handleMouseDownForClick: (e: ReactMouseEvent) => void;
handleClick: (e: ReactMouseEvent) => void;
tooltipContent: TooltipContent | null;
}
export function useFlamegraphHover(
args: UseFlamegraphHoverArgs,
): UseFlamegraphHoverResult {
const {
canvasRef,
spanRectsRef,
eventRectsRef,
traceMetadata,
viewStartTs,
viewEndTs,
isDraggingRef,
onSpanClick,
isDarkMode,
} = args;
const [hoveredSpanId, setHoveredSpanId] = useState<string | null>(null);
const [hoveredEventKey, setHoveredEventKey] = useState<string | null>(null);
const [tooltipContent, setTooltipContent] = useState<TooltipContent | null>(
null,
);
const isZoomed =
viewStartTs !== traceMetadata.startTime ||
viewEndTs !== traceMetadata.endTime;
const updateCursor = useCallback(
(canvas: HTMLCanvasElement, span: FlamegraphSpan | null): void => {
if (span) {
canvas.style.cursor = 'pointer';
} else if (isZoomed) {
canvas.style.cursor = 'grab';
} else {
canvas.style.cursor = 'default';
}
},
[isZoomed],
);
const handleHoverMouseMove = useCallback(
(e: ReactMouseEvent): void => {
if (isDraggingRef.current) {
return;
}
const canvas = canvasRef.current;
if (!canvas) {
return;
}
const pointer = getCanvasPointer(canvas, e.clientX, e.clientY);
if (!pointer) {
return;
}
// Check event dots first — they're drawn on top of spans
const eventRect = findEventAtPosition(
pointer.cssX,
pointer.cssY,
eventRectsRef.current,
);
if (eventRect) {
const { event, span } = eventRect;
const eventTimeMs = event.timeUnixNano / 1e6;
setHoveredEventKey(`${span.spanId}-${event.name}-${event.timeUnixNano}`);
setHoveredSpanId(span.spanId);
setTooltipContent({
serviceName: span.serviceName || '',
spanName: span.name || 'unknown',
status: span.hasError ? 'error' : 'ok',
startMs: span.timestamp - traceMetadata.startTime,
durationMs: span.durationNano / 1e6,
clientX: e.clientX,
clientY: e.clientY,
spanColor: getSpanColor({ span, isDarkMode }),
event: {
name: event.name,
timeOffsetMs: eventTimeMs - span.timestamp,
isError: event.isError,
attributeMap: event.attributeMap || {},
},
});
updateCursor(canvas, eventRect.span);
return;
}
const span = findSpanAtPosition(
pointer.cssX,
pointer.cssY,
spanRectsRef.current,
);
if (span) {
setHoveredEventKey(null);
setHoveredSpanId(span.spanId);
setTooltipContent({
serviceName: span.serviceName || '',
spanName: span.name || 'unknown',
status: span.hasError ? 'error' : 'ok',
startMs: span.timestamp - traceMetadata.startTime,
durationMs: span.durationNano / 1e6,
clientX: e.clientX,
clientY: e.clientY,
spanColor: getSpanColor({ span, isDarkMode }),
});
updateCursor(canvas, span);
} else {
setHoveredEventKey(null);
setHoveredSpanId(null);
setTooltipContent(null);
updateCursor(canvas, null);
}
},
[
canvasRef,
spanRectsRef,
eventRectsRef,
traceMetadata.startTime,
isDraggingRef,
updateCursor,
isDarkMode,
],
);
const handleHoverMouseLeave = useCallback((): void => {
setHoveredEventKey(null);
setHoveredSpanId(null);
setTooltipContent(null);
const canvas = canvasRef.current;
if (canvas) {
updateCursor(canvas, null);
}
}, [canvasRef, updateCursor]);
const mouseDownPosRef = useRef<{ x: number; y: number } | null>(null);
const CLICK_THRESHOLD = 5;
const handleMouseDownForClick = useCallback((e: ReactMouseEvent): void => {
mouseDownPosRef.current = { x: e.clientX, y: e.clientY };
}, []);
const handleClick = useCallback(
(e: ReactMouseEvent): void => {
// Detect drag: if mouse moved more than threshold, skip click
if (mouseDownPosRef.current) {
const dx = e.clientX - mouseDownPosRef.current.x;
const dy = e.clientY - mouseDownPosRef.current.y;
if (Math.sqrt(dx * dx + dy * dy) > CLICK_THRESHOLD) {
mouseDownPosRef.current = null;
return;
}
}
mouseDownPosRef.current = null;
const canvas = canvasRef.current;
if (!canvas) {
return;
}
const pointer = getCanvasPointer(canvas, e.clientX, e.clientY);
if (!pointer) {
return;
}
const span = findSpanAtPosition(
pointer.cssX,
pointer.cssY,
spanRectsRef.current,
);
if (span) {
onSpanClick(span.spanId);
}
},
[canvasRef, spanRectsRef, onSpanClick],
);
return {
hoveredSpanId,
setHoveredSpanId,
hoveredEventKey,
handleHoverMouseMove,
handleHoverMouseLeave,
handleMouseDownForClick,
handleClick,
tooltipContent,
};
}

View File

@@ -0,0 +1,224 @@
import {
Dispatch,
MutableRefObject,
RefObject,
SetStateAction,
useCallback,
useEffect,
useRef,
} from 'react';
import {
DEFAULT_ROW_HEIGHT,
MAX_ROW_HEIGHT,
MIN_ROW_HEIGHT,
MIN_VISIBLE_SPAN_MS,
PINCH_ZOOM_INTENSITY_H,
PINCH_ZOOM_INTENSITY_V,
SCROLL_ZOOM_INTENSITY_H,
SCROLL_ZOOM_INTENSITY_V,
} from '../constants';
import { ITraceMetadata } from '../types';
import { clamp } from '../utils';
interface UseFlamegraphZoomArgs {
canvasRef: RefObject<HTMLCanvasElement>;
traceMetadata: ITraceMetadata;
viewStartRef: MutableRefObject<number>;
viewEndRef: MutableRefObject<number>;
rowHeightRef: MutableRefObject<number>;
setViewStartTs: Dispatch<SetStateAction<number>>;
setViewEndTs: Dispatch<SetStateAction<number>>;
setRowHeight: Dispatch<SetStateAction<number>>;
}
interface UseFlamegraphZoomResult {
handleResetZoom: () => void;
isOverFlamegraphRef: MutableRefObject<boolean>;
}
function getCanvasPointer(
canvasRef: RefObject<HTMLCanvasElement>,
clientX: number,
): { cssX: number; cssWidth: number } | null {
const canvas = canvasRef.current;
if (!canvas) {
return null;
}
const rect = canvas.getBoundingClientRect();
const dpr = window.devicePixelRatio || 1;
const cssWidth = canvas.width / dpr;
const cssX = (clientX - rect.left) * (cssWidth / rect.width);
return { cssX, cssWidth };
}
export function useFlamegraphZoom(
args: UseFlamegraphZoomArgs,
): UseFlamegraphZoomResult {
const {
canvasRef,
traceMetadata,
viewStartRef,
viewEndRef,
rowHeightRef,
setViewStartTs,
setViewEndTs,
setRowHeight,
} = args;
const isOverFlamegraphRef = useRef(false);
const wheelDeltaRef = useRef(0);
const rafRef = useRef<number | null>(null);
const lastCursorXRef = useRef(0);
const lastCssWidthRef = useRef(1);
const lastIsPinchRef = useRef(false);
const lastWheelClientXRef = useRef<number | null>(null);
// Prevent browser zoom when pinching over the flamegraph
useEffect(() => {
const onWheel = (e: WheelEvent): void => {
if (isOverFlamegraphRef.current && e.ctrlKey) {
e.preventDefault();
}
};
window.addEventListener('wheel', onWheel, { passive: false, capture: true });
return (): void => {
window.removeEventListener('wheel', onWheel, {
capture: true,
} as EventListenerOptions);
};
}, []);
const applyWheelZoom = useCallback(() => {
rafRef.current = null;
const cssWidth = lastCssWidthRef.current || 1;
const cursorX = lastCursorXRef.current;
const fullSpanMs = traceMetadata.endTime - traceMetadata.startTime;
const oldStart = viewStartRef.current;
const oldEnd = viewEndRef.current;
const oldSpan = oldEnd - oldStart;
const deltaY = wheelDeltaRef.current;
wheelDeltaRef.current = 0;
if (deltaY === 0) {
return;
}
const zoomH = lastIsPinchRef.current
? PINCH_ZOOM_INTENSITY_H
: SCROLL_ZOOM_INTENSITY_H;
const zoomV = lastIsPinchRef.current
? PINCH_ZOOM_INTENSITY_V
: SCROLL_ZOOM_INTENSITY_V;
const factorH = Math.exp(deltaY * zoomH);
const factorV = Math.exp(deltaY * zoomV);
// --- Horizontal zoom ---
const desiredSpan = oldSpan * factorH;
const minSpanMs = Math.max(
MIN_VISIBLE_SPAN_MS,
oldSpan / Math.max(cssWidth, 1),
);
const clampedSpan = clamp(desiredSpan, minSpanMs, fullSpanMs);
const cursorRatio = clamp(cursorX / cssWidth, 0, 1);
const anchorTs = oldStart + cursorRatio * oldSpan;
let nextStart = anchorTs - cursorRatio * clampedSpan;
nextStart = clamp(
nextStart,
traceMetadata.startTime,
traceMetadata.endTime - clampedSpan,
);
const nextEnd = nextStart + clampedSpan;
// --- Vertical zoom (row height) ---
const desiredRow = rowHeightRef.current * (1 / factorV);
const nextRow = clamp(desiredRow, MIN_ROW_HEIGHT, MAX_ROW_HEIGHT);
// Write refs immediately so rapid wheel events read fresh values
viewStartRef.current = nextStart;
viewEndRef.current = nextEnd;
rowHeightRef.current = nextRow;
setViewStartTs(nextStart);
setViewEndTs(nextEnd);
setRowHeight(nextRow);
}, [
traceMetadata,
viewStartRef,
viewEndRef,
rowHeightRef,
setViewStartTs,
setViewEndTs,
setRowHeight,
]);
// Native wheel listener on the canvas (passive: false for reliable preventDefault)
useEffect(() => {
const canvas = canvasRef.current;
if (!canvas) {
return (): void => {};
}
const onWheel = (e: WheelEvent): void => {
e.preventDefault();
const pointer = getCanvasPointer(canvasRef, e.clientX);
if (!pointer) {
return;
}
// Flush accumulated delta if cursor moved significantly
if (lastWheelClientXRef.current !== null) {
const moved = Math.abs(e.clientX - lastWheelClientXRef.current);
if (moved > 6) {
wheelDeltaRef.current = 0;
}
}
lastWheelClientXRef.current = e.clientX;
lastIsPinchRef.current = e.ctrlKey;
lastCssWidthRef.current = pointer.cssWidth;
lastCursorXRef.current = pointer.cssX;
wheelDeltaRef.current += e.deltaY;
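// Coalesce wheel deltas and apply at most one zoom update per animation frame.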
if (rafRef.current == null) {
rafRef.current = requestAnimationFrame(applyWheelZoom);
}
};
canvas.addEventListener('wheel', onWheel, { passive: false });
return (): void => {
canvas.removeEventListener('wheel', onWheel);
};
}, [canvasRef, applyWheelZoom]);
const handleResetZoom = useCallback(() => {
viewStartRef.current = traceMetadata.startTime;
viewEndRef.current = traceMetadata.endTime;
rowHeightRef.current = DEFAULT_ROW_HEIGHT;
setViewStartTs(traceMetadata.startTime);
setViewEndTs(traceMetadata.endTime);
setRowHeight(DEFAULT_ROW_HEIGHT);
}, [
traceMetadata,
viewStartRef,
viewEndRef,
rowHeightRef,
setViewStartTs,
setViewEndTs,
setRowHeight,
]);
return { handleResetZoom, isOverFlamegraphRef };
}


@@ -0,0 +1,118 @@
import {
Dispatch,
MutableRefObject,
RefObject,
SetStateAction,
useEffect,
} from 'react';
import { FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
import { MIN_VISIBLE_SPAN_MS } from '../constants';
import { ITraceMetadata } from '../types';
import { clamp, findSpanById, getFlamegraphRowMetrics } from '../utils';
interface UseScrollToSpanArgs {
firstSpanAtFetchLevel: string;
spans: FlamegraphSpan[][];
traceMetadata: ITraceMetadata;
containerRef: RefObject<HTMLDivElement>;
viewStartRef: MutableRefObject<number>;
viewEndRef: MutableRefObject<number>;
scrollTopRef: MutableRefObject<number>;
rowHeight: number;
setViewStartTs: Dispatch<SetStateAction<number>>;
setViewEndTs: Dispatch<SetStateAction<number>>;
setScrollTop: Dispatch<SetStateAction<number>>;
}
/**
* When firstSpanAtFetchLevel (from URL spanId) changes, scroll and zoom the
* flamegraph so the selected span is centered in view.
*/
export function useScrollToSpan(args: UseScrollToSpanArgs): void {
const {
firstSpanAtFetchLevel,
spans,
traceMetadata,
containerRef,
viewStartRef,
viewEndRef,
scrollTopRef,
rowHeight,
setViewStartTs,
setViewEndTs,
setScrollTop,
} = args;
useEffect(() => {
if (!firstSpanAtFetchLevel || spans.length === 0) {
return;
}
const result = findSpanById(spans, firstSpanAtFetchLevel);
if (!result) {
return;
}
const { span, levelIndex } = result;
const container = containerRef.current;
if (!container) {
return;
}
const metrics = getFlamegraphRowMetrics(rowHeight);
const viewportHeight = container.clientHeight;
const totalHeight = spans.length * metrics.ROW_HEIGHT;
const maxScroll = Math.max(0, totalHeight - viewportHeight);
// Vertical: center the span's row in the viewport
const targetScrollTop = clamp(
levelIndex * metrics.ROW_HEIGHT -
viewportHeight / 2 +
metrics.ROW_HEIGHT / 2,
0,
maxScroll,
);
// Horizontal: zoom to span with padding (2x span duration), center it
const spanStartMs = span.timestamp;
const spanEndMs = span.timestamp + span.durationNano / 1e6;
const spanDurationMs = spanEndMs - spanStartMs;
const spanCenterMs = (spanStartMs + spanEndMs) / 2;
const visibleWindowMs = Math.max(spanDurationMs * 2, MIN_VISIBLE_SPAN_MS);
const fullSpanMs = traceMetadata.endTime - traceMetadata.startTime;
const clampedWindow = clamp(visibleWindowMs, MIN_VISIBLE_SPAN_MS, fullSpanMs);
let targetViewStart = spanCenterMs - clampedWindow / 2;
let targetViewEnd = spanCenterMs + clampedWindow / 2;
targetViewStart = clamp(
targetViewStart,
traceMetadata.startTime,
traceMetadata.endTime - clampedWindow,
);
targetViewEnd = targetViewStart + clampedWindow;
// Apply immediately (instant jump)
viewStartRef.current = targetViewStart;
viewEndRef.current = targetViewEnd;
scrollTopRef.current = targetScrollTop;
setViewStartTs(targetViewStart);
setViewEndTs(targetViewEnd);
setScrollTop(targetScrollTop);
}, [
firstSpanAtFetchLevel,
spans,
traceMetadata,
containerRef,
viewStartRef,
viewEndRef,
scrollTopRef,
rowHeight,
setViewStartTs,
setViewEndTs,
setScrollTop,
]);
}


@@ -0,0 +1,98 @@
import { useEffect, useRef, useState } from 'react';
import { FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
import { computeVisualLayout, VisualLayout } from '../computeVisualLayout';
import { LayoutWorkerResponse } from '../visualLayoutWorkerTypes';
const EMPTY_LAYOUT: VisualLayout = {
visualRows: [],
spanToVisualRow: {},
connectors: [],
totalVisualRows: 0,
};
function computeLayoutOrEmpty(spans: FlamegraphSpan[][]): VisualLayout {
return spans.length ? computeVisualLayout(spans) : EMPTY_LAYOUT;
}
function createLayoutWorker(): Worker {
return new Worker(new URL('../visualLayout.worker.ts', import.meta.url), {
type: 'module',
});
}
export function useVisualLayoutWorker(
spans: FlamegraphSpan[][],
): { layout: VisualLayout; isComputing: boolean } {
const [layout, setLayout] = useState<VisualLayout>(EMPTY_LAYOUT);
const [isComputing, setIsComputing] = useState(false);
const workerRef = useRef<Worker | null>(null);
const requestIdRef = useRef(0);
const fallbackRef = useRef(typeof Worker === 'undefined');
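// When Web Workers are unavailable (older browsers, some test environments),
// fall back to computing the layout synchronously on the main thread.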
// Effect: post message to worker when spans change
useEffect(() => {
if (fallbackRef.current) {
setLayout(computeLayoutOrEmpty(spans));
return;
}
if (!workerRef.current) {
try {
workerRef.current = createLayoutWorker();
} catch {
fallbackRef.current = true;
setLayout(computeLayoutOrEmpty(spans));
return;
}
}
if (!spans.length) {
setLayout(EMPTY_LAYOUT);
return;
}
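// A monotonically increasing request id lets us ignore stale worker responses
// that arrive after the spans have already changed.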
const currentId = ++requestIdRef.current;
setIsComputing(true);
const worker = workerRef.current;
const onMessage = (e: MessageEvent<LayoutWorkerResponse>): void => {
if (e.data.requestId !== requestIdRef.current) {
return;
}
if (e.data.type === 'result') {
setLayout(e.data.layout);
} else {
setLayout(computeVisualLayout(spans));
}
setIsComputing(false);
};
const onError = (): void => {
if (requestIdRef.current === currentId) {
setLayout(computeVisualLayout(spans));
setIsComputing(false);
}
};
worker.addEventListener('message', onMessage);
worker.addEventListener('error', onError);
worker.postMessage({ type: 'compute', requestId: currentId, spans });
return (): void => {
worker.removeEventListener('message', onMessage);
worker.removeEventListener('error', onError);
};
}, [spans]);
// Cleanup worker on unmount
useEffect(
() => (): void => {
workerRef.current?.terminate();
},
[],
);
return { layout, isComputing };
}

View File

@@ -0,0 +1,32 @@
import { Dispatch, SetStateAction } from 'react';
import { Event, FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
export interface ITraceMetadata {
startTime: number;
endTime: number;
}
export interface FlamegraphCanvasProps {
spans: FlamegraphSpan[][];
firstSpanAtFetchLevel: string;
setFirstSpanAtFetchLevel: Dispatch<SetStateAction<string>>;
onSpanClick: (spanId: string) => void;
traceMetadata: ITraceMetadata;
}
export interface SpanRect {
span: FlamegraphSpan;
x: number;
y: number;
width: number;
height: number;
level: number;
}
export interface EventRect {
event: Event;
span: FlamegraphSpan;
cx: number;
cy: number;
halfSize: number;
}


@@ -0,0 +1,424 @@
import { themeColors } from 'constants/theme';
import { convertTimeToRelevantUnit } from 'container/TraceDetail/utils';
import { generateColor } from 'lib/uPlotLib/utils/generateColor';
import { FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
import {
DASHED_BORDER_LINE_DASH,
EVENT_DOT_SIZE_RATIO,
LABEL_FONT,
LABEL_PADDING_X,
MAX_EVENT_DOT_SIZE,
MAX_SPAN_BAR_HEIGHT,
MIN_EVENT_DOT_SIZE,
MIN_SPAN_BAR_HEIGHT,
MIN_WIDTH_FOR_NAME,
MIN_WIDTH_FOR_NAME_AND_DURATION,
SPAN_BAR_HEIGHT_RATIO,
} from './constants';
import { EventRect, SpanRect } from './types';
export function clamp(v: number, min: number, max: number): number {
return Math.max(min, Math.min(max, v));
}
/** Create diagonal stripe pattern for selected/hovered span (repeating-linear-gradient -45deg style). */
function createStripePattern(
ctx: CanvasRenderingContext2D,
color: string,
): CanvasPattern | null {
const size = 20;
const patternCanvas = document.createElement('canvas');
patternCanvas.width = size;
patternCanvas.height = size;
const pCtx = patternCanvas.getContext('2d');
if (!pCtx) {
return null;
}
// Diagonal stripes at -45deg: 10px transparent, 10px colored (0.04 opacity), repeat
pCtx.globalAlpha = 0.04;
pCtx.strokeStyle = color;
pCtx.lineWidth = 10;
pCtx.lineCap = 'butt';
for (let i = -size; i < size * 2; i += size) {
pCtx.beginPath();
pCtx.moveTo(i + size, 0);
pCtx.lineTo(i, size);
pCtx.stroke();
}
pCtx.globalAlpha = 1;
return ctx.createPattern(patternCanvas, 'repeat');
}
export function findSpanById(
spans: FlamegraphSpan[][],
spanId: string,
): { span: FlamegraphSpan; levelIndex: number } | null {
for (let levelIndex = 0; levelIndex < spans.length; levelIndex++) {
const span = spans[levelIndex]?.find((s) => s.spanId === spanId);
if (span) {
return { span, levelIndex };
}
}
return null;
}
export interface FlamegraphRowMetrics {
ROW_HEIGHT: number;
SPAN_BAR_HEIGHT: number;
SPAN_BAR_Y_OFFSET: number;
EVENT_DOT_SIZE: number;
}
export function getFlamegraphRowMetrics(
rowHeight: number,
): FlamegraphRowMetrics {
const spanBarHeight = clamp(
Math.round(rowHeight * SPAN_BAR_HEIGHT_RATIO),
MIN_SPAN_BAR_HEIGHT,
MAX_SPAN_BAR_HEIGHT,
);
const spanBarYOffset = Math.floor((rowHeight - spanBarHeight) / 2);
const eventDotSize = clamp(
Math.round(spanBarHeight * EVENT_DOT_SIZE_RATIO),
MIN_EVENT_DOT_SIZE,
MAX_EVENT_DOT_SIZE,
);
return {
ROW_HEIGHT: rowHeight,
SPAN_BAR_HEIGHT: spanBarHeight,
SPAN_BAR_Y_OFFSET: spanBarYOffset,
EVENT_DOT_SIZE: eventDotSize,
};
}
interface GetSpanColorArgs {
span: FlamegraphSpan;
isDarkMode: boolean;
}
export function getSpanColor(args: GetSpanColorArgs): string {
const { span, isDarkMode } = args;
let color = generateColor(span.serviceName, themeColors.traceDetailColorsV3);
if (span.hasError) {
color = isDarkMode ? 'rgb(239, 68, 68)' : 'rgb(220, 38, 38)';
}
return color;
}
export interface EventDotColor {
fill: string;
stroke: string;
}
/** Derive event dot colors from parent span color. Error events always use red. */
export function getEventDotColor(
spanColor: string,
isError: boolean,
isDarkMode: boolean,
): EventDotColor {
if (isError) {
return {
fill: isDarkMode ? 'rgb(239, 68, 68)' : 'rgb(220, 38, 38)',
stroke: isDarkMode ? 'rgb(185, 28, 28)' : 'rgb(153, 27, 27)',
};
}
// Parse the span color (hex or rgb) to darken it for the event dot
let r: number | undefined;
let g: number | undefined;
let b: number | undefined;
const rgbMatch = spanColor.match(
/rgba?\(\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)\s*(?:,\s*[\d.]+)?\s*\)/,
);
const hexMatch = spanColor.match(
/^#([0-9a-f]{2})([0-9a-f]{2})([0-9a-f]{2})$/i,
);
if (rgbMatch) {
r = parseInt(rgbMatch[1], 10);
g = parseInt(rgbMatch[2], 10);
b = parseInt(rgbMatch[3], 10);
} else if (hexMatch) {
r = parseInt(hexMatch[1], 16);
g = parseInt(hexMatch[2], 16);
b = parseInt(hexMatch[3], 16);
}
if (r !== undefined && g !== undefined && b !== undefined) {
// Darken by 20% for fill, 40% for stroke
const darken = (v: number, factor: number): number =>
Math.round(v * (1 - factor));
return {
fill: `rgb(${darken(r, 0.2)}, ${darken(g, 0.2)}, ${darken(b, 0.2)})`,
stroke: `rgb(${darken(r, 0.4)}, ${darken(g, 0.4)}, ${darken(b, 0.4)})`,
};
}
// Fallback to original cyan/blue
return {
fill: isDarkMode ? 'rgb(14, 165, 233)' : 'rgb(6, 182, 212)',
stroke: isDarkMode ? 'rgb(2, 132, 199)' : 'rgb(8, 145, 178)',
};
}
interface DrawEventDotArgs {
ctx: CanvasRenderingContext2D;
x: number;
y: number;
color: EventDotColor;
eventDotSize: number;
}
export function drawEventDot(args: DrawEventDotArgs): void {
const { ctx, x, y, color, eventDotSize } = args;
ctx.save();
ctx.translate(x, y);
ctx.rotate(Math.PI / 4);
ctx.fillStyle = color.fill;
ctx.strokeStyle = color.stroke;
ctx.lineWidth = 1;
const half = eventDotSize / 2;
ctx.fillRect(-half, -half, eventDotSize, eventDotSize);
ctx.strokeRect(-half, -half, eventDotSize, eventDotSize);
ctx.restore();
}
interface DrawSpanBarArgs {
ctx: CanvasRenderingContext2D;
span: FlamegraphSpan;
x: number;
y: number;
width: number;
levelIndex: number;
spanRectsArray: SpanRect[];
eventRectsArray: EventRect[];
color: string;
isDarkMode: boolean;
metrics: FlamegraphRowMetrics;
selectedSpanId?: string | null;
hoveredSpanId?: string | null;
hoveredEventKey?: string | null;
}
export function drawSpanBar(args: DrawSpanBarArgs): void {
const {
ctx,
span,
x,
y,
width,
levelIndex,
spanRectsArray,
eventRectsArray,
color,
isDarkMode,
metrics,
selectedSpanId,
hoveredSpanId,
hoveredEventKey,
} = args;
const spanY = y + metrics.SPAN_BAR_Y_OFFSET;
const isSelected = selectedSpanId === span.spanId;
const isHovered = hoveredSpanId === span.spanId;
const isSelectedOrHovered = isSelected || isHovered;
ctx.beginPath();
ctx.roundRect(x, spanY, width, metrics.SPAN_BAR_HEIGHT, 2);
if (isSelectedOrHovered) {
// Diagonal stripe pattern (repeating-linear-gradient -45deg style) + border in span color
const pattern = createStripePattern(ctx, color);
if (pattern) {
ctx.fillStyle = pattern;
ctx.fill();
}
if (isSelected) {
ctx.setLineDash(DASHED_BORDER_LINE_DASH);
}
ctx.strokeStyle = color;
ctx.lineWidth = isSelected ? 2 : 1;
ctx.stroke();
if (isSelected) {
ctx.setLineDash([]);
}
} else {
ctx.fillStyle = color;
ctx.fill();
}
spanRectsArray.push({
span,
x,
y: spanY,
width,
height: metrics.SPAN_BAR_HEIGHT,
level: levelIndex,
});
span.event?.forEach((event) => {
const spanDurationMs = span.durationNano / 1e6;
if (spanDurationMs <= 0) {
return;
}
const eventTimeMs = event.timeUnixNano / 1e6;
const eventOffsetPercent =
((eventTimeMs - span.timestamp) / spanDurationMs) * 100;
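// Clamp to 1-99% so event dots always render inside the span bar.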
const clampedOffset = clamp(eventOffsetPercent, 1, 99);
const eventX = x + (clampedOffset / 100) * width;
const eventY = spanY + metrics.SPAN_BAR_HEIGHT / 2;
const dotColor = getEventDotColor(color, event.isError, isDarkMode);
const eventKey = `${span.spanId}-${event.name}-${event.timeUnixNano}`;
const isEventHovered = hoveredEventKey === eventKey;
const dotSize = isEventHovered
? Math.round(metrics.EVENT_DOT_SIZE * 1.5)
: metrics.EVENT_DOT_SIZE;
drawEventDot({
ctx,
x: eventX,
y: eventY,
color: dotColor,
eventDotSize: dotSize,
});
eventRectsArray.push({
event,
span,
cx: eventX,
cy: eventY,
halfSize: metrics.EVENT_DOT_SIZE / 2,
});
});
drawSpanLabel({
ctx,
span,
x,
y: spanY,
width,
color,
isSelectedOrHovered,
isDarkMode,
spanBarHeight: metrics.SPAN_BAR_HEIGHT,
});
}
export function formatDuration(durationNano: number): string {
const durationMs = durationNano / 1e6;
const { time, timeUnitName } = convertTimeToRelevantUnit(durationMs);
return `${parseFloat(time.toFixed(2))}${timeUnitName}`;
}
interface DrawSpanLabelArgs {
ctx: CanvasRenderingContext2D;
span: FlamegraphSpan;
x: number;
y: number;
width: number;
color: string;
isSelectedOrHovered: boolean;
isDarkMode: boolean;
spanBarHeight: number;
}
function drawSpanLabel(args: DrawSpanLabelArgs): void {
const {
ctx,
span,
x,
y,
width,
color,
isSelectedOrHovered,
isDarkMode,
spanBarHeight,
} = args;
if (width < MIN_WIDTH_FOR_NAME) {
return;
}
const name = span.name;
ctx.save();
// Clip text to span bar bounds
ctx.beginPath();
ctx.rect(x, y, width, spanBarHeight);
ctx.clip();
ctx.font = LABEL_FONT;
ctx.fillStyle = isSelectedOrHovered
? color
: isDarkMode
? 'rgba(0, 0, 0, 0.9)'
: 'rgba(255, 255, 255, 0.9)';
ctx.textBaseline = 'middle';
const textY = y + spanBarHeight / 2;
const leftX = x + LABEL_PADDING_X;
const rightX = x + width - LABEL_PADDING_X;
const availableWidth = width - LABEL_PADDING_X * 2;
if (width >= MIN_WIDTH_FOR_NAME_AND_DURATION) {
const duration = formatDuration(span.durationNano);
const durationWidth = ctx.measureText(duration).width;
const minGap = 6;
const nameSpace = availableWidth - durationWidth - minGap;
// Duration right-aligned
ctx.textAlign = 'right';
ctx.fillText(duration, rightX, textY);
// Name left-aligned, truncated to fit remaining space
if (nameSpace > 20) {
ctx.textAlign = 'left';
ctx.fillText(truncateText(ctx, name, nameSpace), leftX, textY);
}
} else {
// Name only, truncated to fit
ctx.textAlign = 'left';
ctx.fillText(truncateText(ctx, name, availableWidth), leftX, textY);
}
ctx.restore();
}
function truncateText(
ctx: CanvasRenderingContext2D,
text: string,
maxWidth: number,
): string {
const ellipsis = '...';
const ellipsisWidth = ctx.measureText(ellipsis).width;
if (ctx.measureText(text).width <= maxWidth) {
return text;
}
let lo = 0;
let hi = text.length;
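// Binary search for the longest prefix that still fits alongside the ellipsis.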
while (lo < hi) {
const mid = Math.ceil((lo + hi) / 2);
if (ctx.measureText(text.slice(0, mid)).width + ellipsisWidth <= maxWidth) {
lo = mid;
} else {
hi = mid - 1;
}
}
return lo > 0 ? `${text.slice(0, lo)}${ellipsis}` : ellipsis;
}


@@ -0,0 +1,26 @@
/// <reference lib="webworker" />
import { computeVisualLayout } from './computeVisualLayout';
import {
LayoutWorkerRequest,
LayoutWorkerResponse,
} from './visualLayoutWorkerTypes';
self.onmessage = (event: MessageEvent<LayoutWorkerRequest>): void => {
const { requestId, spans } = event.data;
try {
const layout = computeVisualLayout(spans);
const response: LayoutWorkerResponse = {
type: 'result',
requestId,
layout,
};
self.postMessage(response);
} catch (err) {
const response: LayoutWorkerResponse = {
type: 'error',
requestId,
message: String(err),
};
self.postMessage(response);
}
};


@@ -0,0 +1,13 @@
import { FlamegraphSpan } from 'types/api/trace/getTraceFlamegraph';
import { VisualLayout } from './computeVisualLayout';
export interface LayoutWorkerRequest {
type: 'compute';
requestId: number;
spans: FlamegraphSpan[][];
}
export type LayoutWorkerResponse =
| { type: 'result'; requestId: number; layout: VisualLayout }
| { type: 'error'; requestId: number; message: string };


@@ -4,7 +4,8 @@ export interface TraceDetailFlamegraphURLProps {
export interface GetTraceFlamegraphPayloadProps {
traceId: string;
- selectedSpanId: string;
+ selectedSpanId?: string;
+ limit?: number;
}
export interface Event {


@@ -63,6 +63,7 @@ type RetryConfig struct {
func newConfig() factory.Config {
return Config{
Provider: "noop",
BufferSize: 1000,
BatchSize: 100,
FlushInterval: time.Second,


@@ -40,6 +40,7 @@ type querier struct {
promEngine prometheus.Prometheus
traceStmtBuilder qbtypes.StatementBuilder[qbtypes.TraceAggregation]
logStmtBuilder qbtypes.StatementBuilder[qbtypes.LogAggregation]
auditStmtBuilder qbtypes.StatementBuilder[qbtypes.LogAggregation]
metricStmtBuilder qbtypes.StatementBuilder[qbtypes.MetricAggregation]
meterStmtBuilder qbtypes.StatementBuilder[qbtypes.MetricAggregation]
traceOperatorStmtBuilder qbtypes.TraceOperatorStatementBuilder
@@ -56,6 +57,7 @@ func New(
promEngine prometheus.Prometheus,
traceStmtBuilder qbtypes.StatementBuilder[qbtypes.TraceAggregation],
logStmtBuilder qbtypes.StatementBuilder[qbtypes.LogAggregation],
auditStmtBuilder qbtypes.StatementBuilder[qbtypes.LogAggregation],
metricStmtBuilder qbtypes.StatementBuilder[qbtypes.MetricAggregation],
meterStmtBuilder qbtypes.StatementBuilder[qbtypes.MetricAggregation],
traceOperatorStmtBuilder qbtypes.TraceOperatorStatementBuilder,
@@ -69,6 +71,7 @@ func New(
promEngine: promEngine,
traceStmtBuilder: traceStmtBuilder,
logStmtBuilder: logStmtBuilder,
auditStmtBuilder: auditStmtBuilder,
metricStmtBuilder: metricStmtBuilder,
meterStmtBuilder: meterStmtBuilder,
traceOperatorStmtBuilder: traceOperatorStmtBuilder,
@@ -361,7 +364,11 @@ func (q *querier) QueryRange(ctx context.Context, orgID valuer.UUID, req *qbtype
case qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]:
spec.ShiftBy = extractShiftFromBuilderQuery(spec)
timeRange := adjustTimeRangeForShift(spec, qbtypes.TimeRange{From: req.Start, To: req.End}, req.RequestType)
- bq := newBuilderQuery(q.logger, q.telemetryStore, q.logStmtBuilder, spec, timeRange, req.RequestType, tmplVars)
+ stmtBuilder := q.logStmtBuilder
+ if spec.Source == telemetrytypes.SourceAudit {
+ stmtBuilder = q.auditStmtBuilder
+ }
+ bq := newBuilderQuery(q.logger, q.telemetryStore, stmtBuilder, spec, timeRange, req.RequestType, tmplVars)
queries[spec.Name] = bq
steps[spec.Name] = spec.StepInterval
case qbtypes.QueryBuilderQuery[qbtypes.MetricAggregation]:
@@ -550,7 +557,11 @@ func (q *querier) QueryRawStream(ctx context.Context, orgID valuer.UUID, req *qb
case <-tick:
// timestamp end is not specified here
timeRange := adjustTimeRangeForShift(spec, qbtypes.TimeRange{From: tsStart}, req.RequestType)
- bq := newBuilderQuery(q.logger, q.telemetryStore, q.logStmtBuilder, spec, timeRange, req.RequestType, map[string]qbtypes.VariableItem{
+ liveTailStmtBuilder := q.logStmtBuilder
+ if spec.Source == telemetrytypes.SourceAudit {
+ liveTailStmtBuilder = q.auditStmtBuilder
+ }
+ bq := newBuilderQuery(q.logger, q.telemetryStore, liveTailStmtBuilder, spec, timeRange, req.RequestType, map[string]qbtypes.VariableItem{
"id": {
Value: updatedLogID,
},
@@ -850,7 +861,11 @@ func (q *querier) createRangedQuery(originalQuery qbtypes.Query, timeRange qbtyp
specCopy := qt.spec.Copy()
specCopy.ShiftBy = extractShiftFromBuilderQuery(specCopy)
adjustedTimeRange := adjustTimeRangeForShift(specCopy, timeRange, qt.kind)
- return newBuilderQuery(q.logger, q.telemetryStore, q.logStmtBuilder, specCopy, adjustedTimeRange, qt.kind, qt.variables)
+ shiftStmtBuilder := q.logStmtBuilder
+ if qt.spec.Source == telemetrytypes.SourceAudit {
+ shiftStmtBuilder = q.auditStmtBuilder
+ }
+ return newBuilderQuery(q.logger, q.telemetryStore, shiftStmtBuilder, specCopy, adjustedTimeRange, qt.kind, qt.variables)
case *builderQuery[qbtypes.MetricAggregation]:
specCopy := qt.spec.Copy()


@@ -47,6 +47,7 @@ func TestQueryRange_MetricTypeMissing(t *testing.T) {
nil, // prometheus
nil, // traceStmtBuilder
nil, // logStmtBuilder
nil, // auditStmtBuilder
nil, // metricStmtBuilder
nil, // meterStmtBuilder
nil, // traceOperatorStmtBuilder
@@ -110,6 +111,7 @@ func TestQueryRange_MetricTypeFromStore(t *testing.T) {
nil, // prometheus
nil, // traceStmtBuilder
nil, // logStmtBuilder
nil, // auditStmtBuilder
&mockMetricStmtBuilder{}, // metricStmtBuilder
nil, // meterStmtBuilder
nil, // traceOperatorStmtBuilder


@@ -9,6 +9,7 @@ import (
"github.com/SigNoz/signoz/pkg/prometheus"
"github.com/SigNoz/signoz/pkg/querier"
"github.com/SigNoz/signoz/pkg/querybuilder"
"github.com/SigNoz/signoz/pkg/telemetryaudit"
"github.com/SigNoz/signoz/pkg/telemetrylogs"
"github.com/SigNoz/signoz/pkg/telemetrymetadata"
"github.com/SigNoz/signoz/pkg/telemetrymeter"
@@ -63,6 +64,11 @@ func newProvider(
telemetrylogs.TagAttributesV2TableName,
telemetrylogs.LogAttributeKeysTblName,
telemetrylogs.LogResourceKeysTblName,
telemetryaudit.DBName,
telemetryaudit.AuditLogsTableName,
telemetryaudit.TagAttributesTableName,
telemetryaudit.LogAttributeKeysTblName,
telemetryaudit.LogResourceKeysTblName,
telemetrymetadata.DBName,
telemetrymetadata.AttributesMetadataLocalTableName,
telemetrymetadata.ColumnEvolutionMetadataTableName,
@@ -82,13 +88,13 @@ func newProvider(
telemetryStore,
)
- // ADD: Create trace operator statement builder
+ // Create trace operator statement builder
traceOperatorStmtBuilder := telemetrytraces.NewTraceOperatorStatementBuilder(
settings,
telemetryMetadataStore,
traceFieldMapper,
traceConditionBuilder,
- traceStmtBuilder, // Pass the regular trace statement builder
+ traceStmtBuilder,
traceAggExprRewriter,
)
@@ -112,6 +118,26 @@ func newProvider(
telemetrylogs.GetBodyJSONKey,
)
// Create audit statement builder
auditFieldMapper := telemetryaudit.NewFieldMapper()
auditConditionBuilder := telemetryaudit.NewConditionBuilder(auditFieldMapper)
auditAggExprRewriter := querybuilder.NewAggExprRewriter(
settings,
telemetryaudit.DefaultFullTextColumn,
auditFieldMapper,
auditConditionBuilder,
nil,
)
auditStmtBuilder := telemetryaudit.NewAuditQueryStatementBuilder(
settings,
telemetryMetadataStore,
auditFieldMapper,
auditConditionBuilder,
auditAggExprRewriter,
telemetryaudit.DefaultFullTextColumn,
nil,
)
// Create metric statement builder
metricFieldMapper := telemetrymetrics.NewFieldMapper()
metricConditionBuilder := telemetrymetrics.NewConditionBuilder(metricFieldMapper)
@@ -148,6 +174,7 @@ func newProvider(
prometheus,
traceStmtBuilder,
logStmtBuilder,
auditStmtBuilder,
metricStmtBuilder,
meterStmtBuilder,
traceOperatorStmtBuilder,


@@ -208,7 +208,7 @@ func (s *Server) createPublicServer(api *APIHandler, web web.Web) (*http.Server,
s.config.APIServer.Timeout.Default,
s.config.APIServer.Timeout.Max,
).Wrap)
- r.Use(middleware.NewAudit(s.signoz.Instrumentation.Logger(), s.config.APIServer.Logging.ExcludedRoutes, nil).Wrap)
+ r.Use(middleware.NewAudit(s.signoz.Instrumentation.Logger(), s.config.APIServer.Logging.ExcludedRoutes, s.signoz.Auditor).Wrap)
r.Use(middleware.NewComment().Wrap)
am := middleware.NewAuthZ(s.signoz.Instrumentation.Logger(), s.signoz.Modules.OrgGetter, s.signoz.Authz)


@@ -46,6 +46,7 @@ func prepareQuerierForMetrics(t *testing.T, telemetryStore telemetrystore.Teleme
nil, // prometheus
nil, // traceStmtBuilder
nil, // logStmtBuilder
nil, // auditStmtBuilder
metricStmtBuilder,
nil, // meterStmtBuilder
nil, // traceOperatorStmtBuilder
@@ -91,6 +92,7 @@ func prepareQuerierForLogs(telemetryStore telemetrystore.TelemetryStore, keysMap
nil, // prometheus
nil, // traceStmtBuilder
logStmtBuilder, // logStmtBuilder
nil, // auditStmtBuilder
nil, // metricStmtBuilder
nil, // meterStmtBuilder
nil, // traceOperatorStmtBuilder
@@ -131,6 +133,7 @@ func prepareQuerierForTraces(telemetryStore telemetrystore.TelemetryStore, keysM
nil, // prometheus
traceStmtBuilder, // traceStmtBuilder
nil, // logStmtBuilder
nil, // auditStmtBuilder
nil, // metricStmtBuilder
nil, // meterStmtBuilder
nil, // traceOperatorStmtBuilder


@@ -11,6 +11,7 @@ import (
"github.com/SigNoz/signoz/pkg/alertmanager"
"github.com/SigNoz/signoz/pkg/analytics"
"github.com/SigNoz/signoz/pkg/apiserver"
"github.com/SigNoz/signoz/pkg/auditor"
"github.com/SigNoz/signoz/pkg/cache"
"github.com/SigNoz/signoz/pkg/config"
"github.com/SigNoz/signoz/pkg/emailing"
@@ -123,6 +124,9 @@ type Config struct {
// ServiceAccount config
ServiceAccount serviceaccount.Config `mapstructure:"serviceaccount"`
// Auditor config
Auditor auditor.Config `mapstructure:"auditor"`
}
func NewConfig(ctx context.Context, logger *slog.Logger, resolverConfig config.ResolverConfig) (Config, error) {
@@ -153,6 +157,7 @@ func NewConfig(ctx context.Context, logger *slog.Logger, resolverConfig config.R
user.NewConfigFactory(),
identn.NewConfigFactory(),
serviceaccount.NewConfigFactory(),
auditor.NewConfigFactory(),
}
conf, err := config.New(ctx, resolverConfig, configFactories)


@@ -3,6 +3,8 @@ package signoz
import (
"github.com/SigNoz/signoz/pkg/alertmanager"
"github.com/SigNoz/signoz/pkg/alertmanager/nfmanager"
"github.com/SigNoz/signoz/pkg/auditor"
"github.com/SigNoz/signoz/pkg/auditor/noopauditor"
"github.com/SigNoz/signoz/pkg/alertmanager/nfmanager/rulebasednotification"
"github.com/SigNoz/signoz/pkg/alertmanager/signozalertmanager"
"github.com/SigNoz/signoz/pkg/analytics"
@@ -312,6 +314,12 @@ func NewGlobalProviderFactories(identNConfig identn.Config) factory.NamedMap[fac
)
}
func NewAuditorProviderFactories() factory.NamedMap[factory.ProviderFactory[auditor.Auditor, auditor.Config]] {
return factory.MustNewNamedMap(
noopauditor.NewFactory(),
)
}
func NewFlaggerProviderFactories(registry featuretypes.Registry) factory.NamedMap[factory.ProviderFactory[flagger.FlaggerProvider, flagger.Config]] {
return factory.MustNewNamedMap(
configflagger.NewFactory(registry),


@@ -6,6 +6,7 @@ import (
"github.com/SigNoz/signoz/pkg/alertmanager"
"github.com/SigNoz/signoz/pkg/alertmanager/nfmanager"
"github.com/SigNoz/signoz/pkg/auditor"
"github.com/SigNoz/signoz/pkg/alertmanager/nfmanager/nfroutingstore/sqlroutingstore"
"github.com/SigNoz/signoz/pkg/analytics"
"github.com/SigNoz/signoz/pkg/apiserver"
@@ -33,6 +34,7 @@ import (
"github.com/SigNoz/signoz/pkg/sqlschema"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/SigNoz/signoz/pkg/statsreporter"
"github.com/SigNoz/signoz/pkg/telemetryaudit"
"github.com/SigNoz/signoz/pkg/telemetrylogs"
"github.com/SigNoz/signoz/pkg/telemetrymetadata"
"github.com/SigNoz/signoz/pkg/telemetrymeter"
@@ -74,6 +76,7 @@ type SigNoz struct {
QueryParser queryparser.QueryParser
Flagger flagger.Flagger
Gateway gateway.Gateway
Auditor auditor.Auditor
}
func New(
@@ -93,6 +96,7 @@ func New(
authzCallback func(context.Context, sqlstore.SQLStore, licensing.Licensing, dashboard.Module) (factory.ProviderFactory[authz.AuthZ, authz.Config], error),
dashboardModuleCallback func(sqlstore.SQLStore, factory.ProviderSettings, analytics.Analytics, organization.Getter, queryparser.QueryParser, querier.Querier, licensing.Licensing) dashboard.Module,
gatewayProviderFactory func(licensing.Licensing) factory.ProviderFactory[gateway.Gateway, gateway.Config],
auditorProviderFactories func(licensing.Licensing) factory.NamedMap[factory.ProviderFactory[auditor.Auditor, auditor.Config]],
querierHandlerCallback func(factory.ProviderSettings, querier.Querier, analytics.Analytics) querier.Handler,
) (*SigNoz, error) {
// Initialize instrumentation
@@ -370,6 +374,12 @@ func New(
return nil, err
}
// Initialize auditor from the variant-specific provider factories
auditor, err := factory.NewProviderFromNamedMap(ctx, providerSettings, config.Auditor, auditorProviderFactories(licensing), config.Auditor.Provider)
if err != nil {
return nil, err
}
// Initialize authns
store := sqlauthnstore.NewStore(sqlstore)
authNs, err := authNsCallback(ctx, providerSettings, store, licensing)
@@ -395,6 +405,11 @@ func New(
telemetrylogs.TagAttributesV2TableName,
telemetrylogs.LogAttributeKeysTblName,
telemetrylogs.LogResourceKeysTblName,
telemetryaudit.DBName,
telemetryaudit.AuditLogsTableName,
telemetryaudit.TagAttributesTableName,
telemetryaudit.LogAttributeKeysTblName,
telemetryaudit.LogResourceKeysTblName,
telemetrymetadata.DBName,
telemetrymetadata.AttributesMetadataLocalTableName,
telemetrymetadata.ColumnEvolutionMetadataTableName,
@@ -464,6 +479,7 @@ func New(
factory.NewNamedService(factory.MustNewName("tokenizer"), tokenizer),
factory.NewNamedService(factory.MustNewName("authz"), authz),
factory.NewNamedService(factory.MustNewName("user"), userService, factory.MustNewName("authz")),
factory.NewNamedService(factory.MustNewName("auditor"), auditor),
)
if err != nil {
return nil, err
@@ -510,5 +526,6 @@ func New(
QueryParser: queryParser,
Flagger: flagger,
Gateway: gateway,
Auditor: auditor,
}, nil
}


@@ -0,0 +1,200 @@
package telemetryaudit
import (
"context"
"fmt"
schema "github.com/SigNoz/signoz-otel-collector/cmd/signozschemamigrator/schema_migrator"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/querybuilder"
qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
"github.com/huandu/go-sqlbuilder"
)
type conditionBuilder struct {
fm qbtypes.FieldMapper
}
func NewConditionBuilder(fm qbtypes.FieldMapper) *conditionBuilder {
return &conditionBuilder{fm: fm}
}
func (c *conditionBuilder) conditionFor(
ctx context.Context,
startNs, endNs uint64,
key *telemetrytypes.TelemetryFieldKey,
operator qbtypes.FilterOperator,
value any,
sb *sqlbuilder.SelectBuilder,
) (string, error) {
columns, err := c.fm.ColumnFor(ctx, startNs, endNs, key)
if err != nil {
return "", err
}
if operator.IsStringSearchOperator() {
value = querybuilder.FormatValueForContains(value)
}
fieldExpression, err := c.fm.FieldFor(ctx, startNs, endNs, key)
if err != nil {
return "", err
}
fieldExpression, value = querybuilder.DataTypeCollisionHandledFieldName(key, value, fieldExpression, operator)
switch operator {
case qbtypes.FilterOperatorEqual:
return sb.E(fieldExpression, value), nil
case qbtypes.FilterOperatorNotEqual:
return sb.NE(fieldExpression, value), nil
case qbtypes.FilterOperatorGreaterThan:
return sb.G(fieldExpression, value), nil
case qbtypes.FilterOperatorGreaterThanOrEq:
return sb.GE(fieldExpression, value), nil
case qbtypes.FilterOperatorLessThan:
return sb.LT(fieldExpression, value), nil
case qbtypes.FilterOperatorLessThanOrEq:
return sb.LE(fieldExpression, value), nil
case qbtypes.FilterOperatorLike:
return sb.Like(fieldExpression, value), nil
case qbtypes.FilterOperatorNotLike:
return sb.NotLike(fieldExpression, value), nil
case qbtypes.FilterOperatorILike:
return sb.ILike(fieldExpression, value), nil
case qbtypes.FilterOperatorNotILike:
return sb.NotILike(fieldExpression, value), nil
case qbtypes.FilterOperatorContains:
return sb.ILike(fieldExpression, fmt.Sprintf("%%%s%%", value)), nil
case qbtypes.FilterOperatorNotContains:
return sb.NotILike(fieldExpression, fmt.Sprintf("%%%s%%", value)), nil
case qbtypes.FilterOperatorRegexp:
return fmt.Sprintf(`match(%s, %s)`, sqlbuilder.Escape(fieldExpression), sb.Var(value)), nil
case qbtypes.FilterOperatorNotRegexp:
return fmt.Sprintf(`NOT match(%s, %s)`, sqlbuilder.Escape(fieldExpression), sb.Var(value)), nil
case qbtypes.FilterOperatorBetween:
values, ok := value.([]any)
if !ok {
return "", qbtypes.ErrBetweenValues
}
if len(values) != 2 {
return "", qbtypes.ErrBetweenValues
}
return sb.Between(fieldExpression, values[0], values[1]), nil
case qbtypes.FilterOperatorNotBetween:
values, ok := value.([]any)
if !ok {
return "", qbtypes.ErrBetweenValues
}
if len(values) != 2 {
return "", qbtypes.ErrBetweenValues
}
return sb.NotBetween(fieldExpression, values[0], values[1]), nil
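// IN / NOT IN are expanded into OR / AND chains of per-value comparisons.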
case qbtypes.FilterOperatorIn:
values, ok := value.([]any)
if !ok {
return "", qbtypes.ErrInValues
}
conditions := []string{}
for _, value := range values {
conditions = append(conditions, sb.E(fieldExpression, value))
}
return sb.Or(conditions...), nil
case qbtypes.FilterOperatorNotIn:
values, ok := value.([]any)
if !ok {
return "", qbtypes.ErrInValues
}
conditions := []string{}
for _, value := range values {
conditions = append(conditions, sb.NE(fieldExpression, value))
}
return sb.And(conditions...), nil
case qbtypes.FilterOperatorExists, qbtypes.FilterOperatorNotExists:
var value any
column := columns[0]
switch column.Type.GetType() {
case schema.ColumnTypeEnumJSON:
if operator == qbtypes.FilterOperatorExists {
return sb.IsNotNull(fieldExpression), nil
}
return sb.IsNull(fieldExpression), nil
case schema.ColumnTypeEnumLowCardinality:
switch elementType := column.Type.(schema.LowCardinalityColumnType).ElementType; elementType.GetType() {
case schema.ColumnTypeEnumString:
value = ""
if operator == qbtypes.FilterOperatorExists {
return sb.NE(fieldExpression, value), nil
}
return sb.E(fieldExpression, value), nil
default:
return "", errors.NewInvalidInputf(errors.CodeInvalidInput, "exists operator is not supported for low cardinality column type %s", elementType)
}
case schema.ColumnTypeEnumString:
value = ""
if operator == qbtypes.FilterOperatorExists {
return sb.NE(fieldExpression, value), nil
}
return sb.E(fieldExpression, value), nil
case schema.ColumnTypeEnumUInt64, schema.ColumnTypeEnumUInt32, schema.ColumnTypeEnumUInt8:
value = 0
if operator == qbtypes.FilterOperatorExists {
return sb.NE(fieldExpression, value), nil
}
return sb.E(fieldExpression, value), nil
case schema.ColumnTypeEnumMap:
keyType := column.Type.(schema.MapColumnType).KeyType
if _, ok := keyType.(schema.LowCardinalityColumnType); !ok {
return "", errors.NewInvalidInputf(errors.CodeInvalidInput, "key type %s is not supported for map column type %s", keyType, column.Type)
}
switch valueType := column.Type.(schema.MapColumnType).ValueType; valueType.GetType() {
case schema.ColumnTypeEnumString, schema.ColumnTypeEnumBool, schema.ColumnTypeEnumFloat64:
leftOperand := fmt.Sprintf("mapContains(%s, '%s')", column.Name, key.Name)
if key.Materialized {
leftOperand = telemetrytypes.FieldKeyToMaterializedColumnNameForExists(key)
}
if operator == qbtypes.FilterOperatorExists {
return sb.E(leftOperand, true), nil
}
return sb.NE(leftOperand, true), nil
default:
return "", errors.NewInvalidInputf(errors.CodeInvalidInput, "exists operator is not supported for map column type %s", valueType)
}
default:
return "", errors.NewInvalidInputf(errors.CodeInvalidInput, "exists operator is not supported for column type %s", column.Type)
}
}
return "", errors.NewInvalidInputf(errors.CodeInvalidInput, "unsupported operator: %v", operator)
}
func (c *conditionBuilder) ConditionFor(
ctx context.Context,
startNs uint64,
endNs uint64,
key *telemetrytypes.TelemetryFieldKey,
operator qbtypes.FilterOperator,
value any,
sb *sqlbuilder.SelectBuilder,
) (string, error) {
condition, err := c.conditionFor(ctx, startNs, endNs, key, operator, value, sb)
if err != nil {
return "", err
}
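// Top-level log and scope columns always exist, so the implicit exists-filter
// below is only added for attribute- and resource-context keys.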
if key.FieldContext == telemetrytypes.FieldContextLog || key.FieldContext == telemetrytypes.FieldContextScope {
return condition, nil
}
if operator.AddDefaultExistsFilter() {
existsCondition, err := c.conditionFor(ctx, startNs, endNs, key, qbtypes.FilterOperatorExists, nil, sb)
if err != nil {
return "", err
}
return sb.And(condition, existsCondition), nil
}
return condition, nil
}
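
For illustration, a minimal sketch (not part of the PR) of how this condition builder might be exercised; the key name, the filter value, and the commented result are hypothetical and simply follow the operator and exists-filter handling above:

package telemetryaudit_test

import (
	"context"
	"fmt"

	"github.com/SigNoz/signoz/pkg/telemetryaudit"
	qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
	"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
	"github.com/huandu/go-sqlbuilder"
)

func ExampleNewConditionBuilder() {
	cb := telemetryaudit.NewConditionBuilder(telemetryaudit.NewFieldMapper())
	sb := sqlbuilder.NewSelectBuilder()
	cond, err := cb.ConditionFor(context.Background(), 0, 0,
		&telemetrytypes.TelemetryFieldKey{
			Name:          "user.email", // hypothetical attribute key
			FieldContext:  telemetrytypes.FieldContextAttribute,
			FieldDataType: telemetrytypes.FieldDataTypeString,
		},
		qbtypes.FilterOperatorEqual, "admin@example.com", sb,
	)
	if err != nil {
		return
	}
	// cond should resemble (with the default exists-filter appended):
	//   (attributes_string['user.email'] = ? AND mapContains(attributes_string, 'user.email') = ?)
	sb.Where(cond)
	fmt.Println(cond)
}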

pkg/telemetryaudit/const.go

@@ -0,0 +1,129 @@
package telemetryaudit
import (
schema "github.com/SigNoz/signoz-otel-collector/cmd/signozschemamigrator/schema_migrator"
qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
)
const (
// Internal Columns.
IDColumn = "id"
TimestampBucketStartColumn = "ts_bucket_start"
ResourceFingerPrintColumn = "resource_fingerprint"
// Intrinsic Columns.
TimestampColumn = "timestamp"
ObservedTimestampColumn = "observed_timestamp"
BodyColumn = "body"
EventNameColumn = "event_name"
TraceIDColumn = "trace_id"
SpanIDColumn = "span_id"
TraceFlagsColumn = "trace_flags"
SeverityTextColumn = "severity_text"
SeverityNumberColumn = "severity_number"
ScopeNameColumn = "scope_name"
ScopeVersionColumn = "scope_version"
// Contextual Columns.
AttributesStringColumn = "attributes_string"
AttributesNumberColumn = "attributes_number"
AttributesBoolColumn = "attributes_bool"
ResourceColumn = "resource"
ScopeStringColumn = "scope_string"
)
var (
DefaultFullTextColumn = &telemetrytypes.TelemetryFieldKey{
Name: "body",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeString,
}
IntrinsicFields = map[string]telemetrytypes.TelemetryFieldKey{
"body": {
Name: "body",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeString,
},
"trace_id": {
Name: "trace_id",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeString,
},
"span_id": {
Name: "span_id",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeString,
},
"trace_flags": {
Name: "trace_flags",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeNumber,
},
"severity_text": {
Name: "severity_text",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeString,
},
"severity_number": {
Name: "severity_number",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeNumber,
},
"event_name": {
Name: "event_name",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeString,
},
}
DefaultSortingOrder = []qbtypes.OrderBy{
{
Key: qbtypes.OrderByKey{
TelemetryFieldKey: telemetrytypes.TelemetryFieldKey{
Name: TimestampColumn,
},
},
Direction: qbtypes.OrderDirectionDesc,
},
{
Key: qbtypes.OrderByKey{
TelemetryFieldKey: telemetrytypes.TelemetryFieldKey{
Name: IDColumn,
},
},
Direction: qbtypes.OrderDirectionDesc,
},
}
)
var auditLogColumns = map[string]*schema.Column{
"ts_bucket_start": {Name: "ts_bucket_start", Type: schema.ColumnTypeUInt64},
"resource_fingerprint": {Name: "resource_fingerprint", Type: schema.ColumnTypeString},
"timestamp": {Name: "timestamp", Type: schema.ColumnTypeUInt64},
"observed_timestamp": {Name: "observed_timestamp", Type: schema.ColumnTypeUInt64},
"id": {Name: "id", Type: schema.ColumnTypeString},
"trace_id": {Name: "trace_id", Type: schema.ColumnTypeString},
"span_id": {Name: "span_id", Type: schema.ColumnTypeString},
"trace_flags": {Name: "trace_flags", Type: schema.ColumnTypeUInt32},
"severity_text": {Name: "severity_text", Type: schema.LowCardinalityColumnType{ElementType: schema.ColumnTypeString}},
"severity_number": {Name: "severity_number", Type: schema.ColumnTypeUInt8},
"body": {Name: "body", Type: schema.ColumnTypeString},
"attributes_string": {Name: "attributes_string", Type: schema.MapColumnType{KeyType: schema.LowCardinalityColumnType{ElementType: schema.ColumnTypeString}, ValueType: schema.ColumnTypeString}},
"attributes_number": {Name: "attributes_number", Type: schema.MapColumnType{KeyType: schema.LowCardinalityColumnType{ElementType: schema.ColumnTypeString}, ValueType: schema.ColumnTypeFloat64}},
"attributes_bool": {Name: "attributes_bool", Type: schema.MapColumnType{KeyType: schema.LowCardinalityColumnType{ElementType: schema.ColumnTypeString}, ValueType: schema.ColumnTypeBool}},
"resource": {Name: "resource", Type: schema.JSONColumnType{}},
"event_name": {Name: "event_name", Type: schema.ColumnTypeString},
"scope_name": {Name: "scope_name", Type: schema.ColumnTypeString},
"scope_version": {Name: "scope_version", Type: schema.ColumnTypeString},
"scope_string": {Name: "scope_string", Type: schema.MapColumnType{KeyType: schema.LowCardinalityColumnType{ElementType: schema.ColumnTypeString}, ValueType: schema.ColumnTypeString}},
}


@@ -0,0 +1,124 @@
package telemetryaudit
import (
"context"
"fmt"
schema "github.com/SigNoz/signoz-otel-collector/cmd/signozschemamigrator/schema_migrator"
"github.com/SigNoz/signoz/pkg/errors"
qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
"github.com/huandu/go-sqlbuilder"
"golang.org/x/exp/maps"
)
type fieldMapper struct{}
func NewFieldMapper() qbtypes.FieldMapper {
return &fieldMapper{}
}
func (m *fieldMapper) getColumn(_ context.Context, key *telemetrytypes.TelemetryFieldKey) ([]*schema.Column, error) {
switch key.FieldContext {
case telemetrytypes.FieldContextResource:
return []*schema.Column{auditLogColumns["resource"]}, nil
case telemetrytypes.FieldContextScope:
switch key.Name {
case "name", "scope.name", "scope_name":
return []*schema.Column{auditLogColumns["scope_name"]}, nil
case "version", "scope.version", "scope_version":
return []*schema.Column{auditLogColumns["scope_version"]}, nil
}
return []*schema.Column{auditLogColumns["scope_string"]}, nil
case telemetrytypes.FieldContextAttribute:
switch key.FieldDataType {
case telemetrytypes.FieldDataTypeString:
return []*schema.Column{auditLogColumns["attributes_string"]}, nil
case telemetrytypes.FieldDataTypeInt64, telemetrytypes.FieldDataTypeFloat64, telemetrytypes.FieldDataTypeNumber:
return []*schema.Column{auditLogColumns["attributes_number"]}, nil
case telemetrytypes.FieldDataTypeBool:
return []*schema.Column{auditLogColumns["attributes_bool"]}, nil
}
case telemetrytypes.FieldContextLog, telemetrytypes.FieldContextUnspecified:
col, ok := auditLogColumns[key.Name]
if !ok {
return nil, qbtypes.ErrColumnNotFound
}
return []*schema.Column{col}, nil
}
return nil, qbtypes.ErrColumnNotFound
}
func (m *fieldMapper) FieldFor(ctx context.Context, _, _ uint64, key *telemetrytypes.TelemetryFieldKey) (string, error) {
columns, err := m.getColumn(ctx, key)
if err != nil {
return "", err
}
if len(columns) != 1 {
return "", errors.Newf(errors.TypeInternal, errors.CodeInternal, "expected exactly 1 column, got %d", len(columns))
}
column := columns[0]
switch column.Type.GetType() {
case schema.ColumnTypeEnumJSON:
if key.FieldContext != telemetrytypes.FieldContextResource {
return "", errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "only resource context fields are supported for json columns in audit, got %s", key.FieldContext.String)
}
return fmt.Sprintf("%s.`%s`::String", column.Name, key.Name), nil
case schema.ColumnTypeEnumLowCardinality:
return column.Name, nil
case schema.ColumnTypeEnumString, schema.ColumnTypeEnumUInt64, schema.ColumnTypeEnumUInt32, schema.ColumnTypeEnumUInt8:
return column.Name, nil
case schema.ColumnTypeEnumMap:
keyType := column.Type.(schema.MapColumnType).KeyType
if _, ok := keyType.(schema.LowCardinalityColumnType); !ok {
return "", errors.NewInvalidInputf(errors.CodeInvalidInput, "key type %s is not supported for map column type %s", keyType, column.Type)
}
switch valueType := column.Type.(schema.MapColumnType).ValueType; valueType.GetType() {
case schema.ColumnTypeEnumString, schema.ColumnTypeEnumBool, schema.ColumnTypeEnumFloat64:
if key.Materialized {
return telemetrytypes.FieldKeyToMaterializedColumnName(key), nil
}
return fmt.Sprintf("%s['%s']", column.Name, key.Name), nil
default:
return "", errors.NewInvalidInputf(errors.CodeInvalidInput, "unsupported map value type %s", valueType)
}
}
return column.Name, nil
}
func (m *fieldMapper) ColumnFor(ctx context.Context, _, _ uint64, key *telemetrytypes.TelemetryFieldKey) ([]*schema.Column, error) {
return m.getColumn(ctx, key)
}
func (m *fieldMapper) ColumnExpressionFor(
ctx context.Context,
tsStart, tsEnd uint64,
field *telemetrytypes.TelemetryFieldKey,
keys map[string][]*telemetrytypes.TelemetryFieldKey,
) (string, error) {
fieldExpression, err := m.FieldFor(ctx, tsStart, tsEnd, field)
if errors.Is(err, qbtypes.ErrColumnNotFound) {
keysForField := keys[field.Name]
if len(keysForField) == 0 {
if _, ok := auditLogColumns[field.Name]; ok {
field.FieldContext = telemetrytypes.FieldContextLog
fieldExpression, _ = m.FieldFor(ctx, tsStart, tsEnd, field)
} else {
correction, found := telemetrytypes.SuggestCorrection(field.Name, maps.Keys(keys))
if found {
return "", errors.Wrap(err, errors.TypeInvalidInput, errors.CodeInvalidInput, correction)
}
return "", errors.Wrapf(err, errors.TypeInvalidInput, errors.CodeInvalidInput, "field `%s` not found", field.Name)
}
} else {
fieldExpression, _ = m.FieldFor(ctx, tsStart, tsEnd, keysForField[0])
}
}
return fmt.Sprintf("%s AS `%s`", sqlbuilder.Escape(fieldExpression), field.Name), nil
}
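
A rough sketch of the field expressions this mapper yields for the three main contexts; the key names are illustrative and the commented results follow the mappings above:

package telemetryaudit_test

import (
	"context"

	"github.com/SigNoz/signoz/pkg/telemetryaudit"
	"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
)

func ExampleNewFieldMapper() {
	fm := telemetryaudit.NewFieldMapper()
	ctx := context.Background()

	// Resource context resolves through the resource JSON column.
	res, _ := fm.FieldFor(ctx, 0, 0, &telemetrytypes.TelemetryFieldKey{
		Name:         "service.name",
		FieldContext: telemetrytypes.FieldContextResource,
	})
	// res == "resource.`service.name`::String"

	// String attributes resolve through the attributes_string map.
	attr, _ := fm.FieldFor(ctx, 0, 0, &telemetrytypes.TelemetryFieldKey{
		Name:          "user.id",
		FieldContext:  telemetrytypes.FieldContextAttribute,
		FieldDataType: telemetrytypes.FieldDataTypeString,
	})
	// attr == "attributes_string['user.id']"

	// Intrinsic log columns map straight to the column name.
	body, _ := fm.FieldFor(ctx, 0, 0, &telemetrytypes.TelemetryFieldKey{
		Name:         "body",
		FieldContext: telemetrytypes.FieldContextLog,
	})
	// body == "body"

	_, _, _ = res, attr, body
}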


@@ -0,0 +1,612 @@
package telemetryaudit
import (
"context"
"fmt"
"log/slog"
"strings"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/querybuilder"
"github.com/SigNoz/signoz/pkg/telemetryresourcefilter"
qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
"github.com/huandu/go-sqlbuilder"
)
type auditQueryStatementBuilder struct {
logger *slog.Logger
metadataStore telemetrytypes.MetadataStore
fm qbtypes.FieldMapper
cb qbtypes.ConditionBuilder
resourceFilterStmtBuilder qbtypes.StatementBuilder[qbtypes.LogAggregation]
aggExprRewriter qbtypes.AggExprRewriter
fullTextColumn *telemetrytypes.TelemetryFieldKey
jsonKeyToKey qbtypes.JsonKeyToFieldFunc
}
var _ qbtypes.StatementBuilder[qbtypes.LogAggregation] = (*auditQueryStatementBuilder)(nil)
func NewAuditQueryStatementBuilder(
settings factory.ProviderSettings,
metadataStore telemetrytypes.MetadataStore,
fieldMapper qbtypes.FieldMapper,
conditionBuilder qbtypes.ConditionBuilder,
aggExprRewriter qbtypes.AggExprRewriter,
fullTextColumn *telemetrytypes.TelemetryFieldKey,
jsonKeyToKey qbtypes.JsonKeyToFieldFunc,
) *auditQueryStatementBuilder {
auditSettings := factory.NewScopedProviderSettings(settings, "github.com/SigNoz/signoz/pkg/telemetryaudit")
resourceFilterStmtBuilder := telemetryresourcefilter.New[qbtypes.LogAggregation](
settings,
DBName,
LogsResourceTableName,
telemetrytypes.SignalLogs,
telemetrytypes.SourceAudit,
metadataStore,
fullTextColumn,
jsonKeyToKey,
)
return &auditQueryStatementBuilder{
logger: auditSettings.Logger(),
metadataStore: metadataStore,
fm: fieldMapper,
cb: conditionBuilder,
resourceFilterStmtBuilder: resourceFilterStmtBuilder,
aggExprRewriter: aggExprRewriter,
fullTextColumn: fullTextColumn,
jsonKeyToKey: jsonKeyToKey,
}
}
func (b *auditQueryStatementBuilder) Build(
ctx context.Context,
start uint64,
end uint64,
requestType qbtypes.RequestType,
query qbtypes.QueryBuilderQuery[qbtypes.LogAggregation],
variables map[string]qbtypes.VariableItem,
) (*qbtypes.Statement, error) {
start = querybuilder.ToNanoSecs(start)
end = querybuilder.ToNanoSecs(end)
keySelectors := getKeySelectors(query)
keys, _, err := b.metadataStore.GetKeysMulti(ctx, keySelectors)
if err != nil {
return nil, err
}
query = b.adjustKeys(ctx, keys, query, requestType)
q := sqlbuilder.NewSelectBuilder()
var stmt *qbtypes.Statement
switch requestType {
case qbtypes.RequestTypeRaw, qbtypes.RequestTypeRawStream:
stmt, err = b.buildListQuery(ctx, q, query, start, end, keys, variables)
case qbtypes.RequestTypeTimeSeries:
stmt, err = b.buildTimeSeriesQuery(ctx, q, query, start, end, keys, variables)
case qbtypes.RequestTypeScalar:
stmt, err = b.buildScalarQuery(ctx, q, query, start, end, keys, false, variables)
default:
return nil, errors.NewInvalidInputf(errors.CodeInvalidInput, "unsupported request type: %s", requestType)
}
if err != nil {
return nil, err
}
return stmt, nil
}
func getKeySelectors(query qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]) []*telemetrytypes.FieldKeySelector {
var keySelectors []*telemetrytypes.FieldKeySelector
for idx := range query.Aggregations {
aggExpr := query.Aggregations[idx]
selectors := querybuilder.QueryStringToKeysSelectors(aggExpr.Expression)
keySelectors = append(keySelectors, selectors...)
}
if query.Filter != nil && query.Filter.Expression != "" {
whereClauseSelectors := querybuilder.QueryStringToKeysSelectors(query.Filter.Expression)
keySelectors = append(keySelectors, whereClauseSelectors...)
}
for idx := range query.GroupBy {
groupBy := query.GroupBy[idx]
keySelectors = append(keySelectors, &telemetrytypes.FieldKeySelector{
Name: groupBy.Name,
Signal: telemetrytypes.SignalLogs,
FieldContext: groupBy.FieldContext,
FieldDataType: groupBy.FieldDataType,
})
}
for idx := range query.SelectFields {
selectField := query.SelectFields[idx]
keySelectors = append(keySelectors, &telemetrytypes.FieldKeySelector{
Name: selectField.Name,
Signal: telemetrytypes.SignalLogs,
FieldContext: selectField.FieldContext,
FieldDataType: selectField.FieldDataType,
})
}
for idx := range query.Order {
keySelectors = append(keySelectors, &telemetrytypes.FieldKeySelector{
Name: query.Order[idx].Key.Name,
Signal: telemetrytypes.SignalLogs,
FieldContext: query.Order[idx].Key.FieldContext,
FieldDataType: query.Order[idx].Key.FieldDataType,
})
}
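// Pin every selector to the logs signal with the audit source so key lookups
// resolve against the signoz_audit metadata tables.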
for idx := range keySelectors {
keySelectors[idx].Signal = telemetrytypes.SignalLogs
keySelectors[idx].Source = telemetrytypes.SourceAudit
keySelectors[idx].SelectorMatchType = telemetrytypes.FieldSelectorMatchTypeExact
}
return keySelectors
}
func (b *auditQueryStatementBuilder) adjustKeys(ctx context.Context, keys map[string][]*telemetrytypes.TelemetryFieldKey, query qbtypes.QueryBuilderQuery[qbtypes.LogAggregation], requestType qbtypes.RequestType) qbtypes.QueryBuilderQuery[qbtypes.LogAggregation] {
keys["id"] = append([]*telemetrytypes.TelemetryFieldKey{{
Name: "id",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeString,
}}, keys["id"]...)
keys["timestamp"] = append([]*telemetrytypes.TelemetryFieldKey{{
Name: "timestamp",
Signal: telemetrytypes.SignalLogs,
FieldContext: telemetrytypes.FieldContextLog,
FieldDataType: telemetrytypes.FieldDataTypeNumber,
}}, keys["timestamp"]...)
actions := querybuilder.AdjustKeysForAliasExpressions(&query, requestType)
actions = append(actions, querybuilder.AdjustDuplicateKeys(&query)...)
for idx := range query.SelectFields {
actions = append(actions, b.adjustKey(&query.SelectFields[idx], keys)...)
}
for idx := range query.GroupBy {
actions = append(actions, b.adjustKey(&query.GroupBy[idx].TelemetryFieldKey, keys)...)
}
for idx := range query.Order {
actions = append(actions, b.adjustKey(&query.Order[idx].Key.TelemetryFieldKey, keys)...)
}
for _, action := range actions {
b.logger.InfoContext(ctx, "key adjustment action", slog.String("action", action))
}
return query
}
func (b *auditQueryStatementBuilder) adjustKey(key *telemetrytypes.TelemetryFieldKey, keys map[string][]*telemetrytypes.TelemetryFieldKey) []string {
if _, ok := IntrinsicFields[key.Name]; ok {
intrinsicField := IntrinsicFields[key.Name]
return querybuilder.AdjustKey(key, keys, &intrinsicField)
}
return querybuilder.AdjustKey(key, keys, nil)
}
func (b *auditQueryStatementBuilder) buildListQuery(
ctx context.Context,
sb *sqlbuilder.SelectBuilder,
query qbtypes.QueryBuilderQuery[qbtypes.LogAggregation],
start, end uint64,
keys map[string][]*telemetrytypes.TelemetryFieldKey,
variables map[string]qbtypes.VariableItem,
) (*qbtypes.Statement, error) {
var (
cteFragments []string
cteArgs [][]any
)
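// When the filter touches resource-context fields, attach a CTE built against
// the separate resource table and restrict the main query through it.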
if frag, args, err := b.maybeAttachResourceFilter(ctx, sb, query, start, end, variables); err != nil {
return nil, err
} else if frag != "" {
cteFragments = append(cteFragments, frag)
cteArgs = append(cteArgs, args)
}
sb.Select(TimestampColumn)
sb.SelectMore(IDColumn)
if len(query.SelectFields) == 0 {
sb.SelectMore(TraceIDColumn)
sb.SelectMore(SpanIDColumn)
sb.SelectMore(TraceFlagsColumn)
sb.SelectMore(SeverityTextColumn)
sb.SelectMore(SeverityNumberColumn)
sb.SelectMore(ScopeNameColumn)
sb.SelectMore(ScopeVersionColumn)
sb.SelectMore(BodyColumn)
sb.SelectMore(EventNameColumn)
sb.SelectMore(AttributesStringColumn)
sb.SelectMore(AttributesNumberColumn)
sb.SelectMore(AttributesBoolColumn)
sb.SelectMore(ResourceColumn)
sb.SelectMore(ScopeStringColumn)
} else {
for index := range query.SelectFields {
if query.SelectFields[index].Name == TimestampColumn || query.SelectFields[index].Name == IDColumn {
continue
}
colExpr, err := b.fm.ColumnExpressionFor(ctx, start, end, &query.SelectFields[index], keys)
if err != nil {
return nil, err
}
sb.SelectMore(colExpr)
}
}
sb.From(fmt.Sprintf("%s.%s", DBName, AuditLogsTableName))
preparedWhereClause, err := b.addFilterCondition(ctx, sb, start, end, query, keys, variables)
if err != nil {
return nil, err
}
for _, orderBy := range query.Order {
colExpr, err := b.fm.ColumnExpressionFor(ctx, start, end, &orderBy.Key.TelemetryFieldKey, keys)
if err != nil {
return nil, err
}
sb.OrderBy(fmt.Sprintf("%s %s", colExpr, orderBy.Direction.StringValue()))
}
if query.Limit > 0 {
sb.Limit(query.Limit)
} else {
sb.Limit(100)
}
if query.Offset > 0 {
sb.Offset(query.Offset)
}
mainSQL, mainArgs := sb.BuildWithFlavor(sqlbuilder.ClickHouse)
finalSQL := querybuilder.CombineCTEs(cteFragments) + mainSQL
finalArgs := querybuilder.PrependArgs(cteArgs, mainArgs)
stmt := &qbtypes.Statement{
Query: finalSQL,
Args: finalArgs,
}
if preparedWhereClause != nil {
stmt.Warnings = preparedWhereClause.Warnings
stmt.WarningsDocURL = preparedWhereClause.WarningsDocURL
}
return stmt, nil
}
func (b *auditQueryStatementBuilder) buildTimeSeriesQuery(
ctx context.Context,
sb *sqlbuilder.SelectBuilder,
query qbtypes.QueryBuilderQuery[qbtypes.LogAggregation],
start, end uint64,
keys map[string][]*telemetrytypes.TelemetryFieldKey,
variables map[string]qbtypes.VariableItem,
) (*qbtypes.Statement, error) {
var (
cteFragments []string
cteArgs [][]any
)
if frag, args, err := b.maybeAttachResourceFilter(ctx, sb, query, start, end, variables); err != nil {
return nil, err
} else if frag != "" {
cteFragments = append(cteFragments, frag)
cteArgs = append(cteArgs, args)
}
sb.SelectMore(fmt.Sprintf(
"toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL %d SECOND) AS ts",
int64(query.StepInterval.Seconds()),
))
var allGroupByArgs []any
fieldNames := make([]string, 0, len(query.GroupBy))
for _, gb := range query.GroupBy {
expr, args, err := querybuilder.CollisionHandledFinalExpr(ctx, start, end, &gb.TelemetryFieldKey, b.fm, b.cb, keys, telemetrytypes.FieldDataTypeString, b.jsonKeyToKey)
if err != nil {
return nil, err
}
colExpr := fmt.Sprintf("toString(%s) AS `%s`", expr, gb.Name)
allGroupByArgs = append(allGroupByArgs, args...)
sb.SelectMore(colExpr)
fieldNames = append(fieldNames, fmt.Sprintf("`%s`", gb.Name))
}
allAggChArgs := make([]any, 0)
for i, agg := range query.Aggregations {
rewritten, chArgs, err := b.aggExprRewriter.Rewrite(ctx, start, end, agg.Expression, uint64(query.StepInterval.Seconds()), keys)
if err != nil {
return nil, err
}
allAggChArgs = append(allAggChArgs, chArgs...)
sb.SelectMore(fmt.Sprintf("%s AS __result_%d", rewritten, i))
}
sb.From(fmt.Sprintf("%s.%s", DBName, AuditLogsTableName))
preparedWhereClause, err := b.addFilterCondition(ctx, sb, start, end, query, keys, variables)
if err != nil {
return nil, err
}
var finalSQL string
var finalArgs []any
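// Top-N path: with a limit and group-by, rank the groups first in a scalar
// __limit_cte, then restrict the series query to those groups. The resulting shape
// (as exercised by the statement builder tests below) is roughly:
//
//   WITH __limit_cte AS (SELECT <groups>, <aggs> ... ORDER BY __result_0 DESC LIMIT ?)
//   SELECT ts, <groups>, <aggs> ... WHERE (<groups>) GLOBAL IN (SELECT <groups> FROM __limit_cte)
//   GROUP BY ts, <groups>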
if query.Limit > 0 && len(query.GroupBy) > 0 {
cteSB := sqlbuilder.NewSelectBuilder()
cteStmt, err := b.buildScalarQuery(ctx, cteSB, query, start, end, keys, true, variables)
if err != nil {
return nil, err
}
cteFragments = append(cteFragments, fmt.Sprintf("__limit_cte AS (%s)", cteStmt.Query))
cteArgs = append(cteArgs, cteStmt.Args)
tuple := fmt.Sprintf("(%s)", strings.Join(fieldNames, ", "))
sb.Where(fmt.Sprintf("%s GLOBAL IN (SELECT %s FROM __limit_cte)", tuple, strings.Join(fieldNames, ", ")))
sb.GroupBy("ts")
sb.GroupBy(querybuilder.GroupByKeys(query.GroupBy)...)
if query.Having != nil && query.Having.Expression != "" {
rewriter := querybuilder.NewHavingExpressionRewriter()
rewrittenExpr, err := rewriter.RewriteForLogs(query.Having.Expression, query.Aggregations)
if err != nil {
return nil, err
}
sb.Having(rewrittenExpr)
}
if len(query.Order) != 0 {
for _, orderBy := range query.Order {
_, ok := aggOrderBy(orderBy, query)
if !ok {
sb.OrderBy(fmt.Sprintf("`%s` %s", orderBy.Key.Name, orderBy.Direction.StringValue()))
}
}
sb.OrderBy("ts desc")
}
combinedArgs := append(allGroupByArgs, allAggChArgs...)
mainSQL, mainArgs := sb.BuildWithFlavor(sqlbuilder.ClickHouse, combinedArgs...)
finalSQL = querybuilder.CombineCTEs(cteFragments) + mainSQL
finalArgs = querybuilder.PrependArgs(cteArgs, mainArgs)
} else {
sb.GroupBy("ts")
sb.GroupBy(querybuilder.GroupByKeys(query.GroupBy)...)
if query.Having != nil && query.Having.Expression != "" {
rewriter := querybuilder.NewHavingExpressionRewriter()
rewrittenExpr, err := rewriter.RewriteForLogs(query.Having.Expression, query.Aggregations)
if err != nil {
return nil, err
}
sb.Having(rewrittenExpr)
}
if len(query.Order) != 0 {
for _, orderBy := range query.Order {
_, ok := aggOrderBy(orderBy, query)
if !ok {
sb.OrderBy(fmt.Sprintf("`%s` %s", orderBy.Key.Name, orderBy.Direction.StringValue()))
}
}
sb.OrderBy("ts desc")
}
combinedArgs := append(allGroupByArgs, allAggChArgs...)
mainSQL, mainArgs := sb.BuildWithFlavor(sqlbuilder.ClickHouse, combinedArgs...)
finalSQL = querybuilder.CombineCTEs(cteFragments) + mainSQL
finalArgs = querybuilder.PrependArgs(cteArgs, mainArgs)
}
stmt := &qbtypes.Statement{
Query: finalSQL,
Args: finalArgs,
}
if preparedWhereClause != nil {
stmt.Warnings = preparedWhereClause.Warnings
stmt.WarningsDocURL = preparedWhereClause.WarningsDocURL
}
return stmt, nil
}
func (b *auditQueryStatementBuilder) buildScalarQuery(
ctx context.Context,
sb *sqlbuilder.SelectBuilder,
query qbtypes.QueryBuilderQuery[qbtypes.LogAggregation],
start, end uint64,
keys map[string][]*telemetrytypes.TelemetryFieldKey,
skipResourceCTE bool,
variables map[string]qbtypes.VariableItem,
) (*qbtypes.Statement, error) {
var (
cteFragments []string
cteArgs [][]any
)
if frag, args, err := b.maybeAttachResourceFilter(ctx, sb, query, start, end, variables); err != nil {
return nil, err
} else if frag != "" && !skipResourceCTE {
cteFragments = append(cteFragments, frag)
cteArgs = append(cteArgs, args)
}
allAggChArgs := []any{}
var allGroupByArgs []any
for _, gb := range query.GroupBy {
expr, args, err := querybuilder.CollisionHandledFinalExpr(ctx, start, end, &gb.TelemetryFieldKey, b.fm, b.cb, keys, telemetrytypes.FieldDataTypeString, b.jsonKeyToKey)
if err != nil {
return nil, err
}
colExpr := fmt.Sprintf("toString(%s) AS `%s`", expr, gb.Name)
allGroupByArgs = append(allGroupByArgs, args...)
sb.SelectMore(colExpr)
}
rateInterval := (end - start) / querybuilder.NsToSeconds
if len(query.Aggregations) > 0 {
for idx := range query.Aggregations {
aggExpr := query.Aggregations[idx]
rewritten, chArgs, err := b.aggExprRewriter.Rewrite(ctx, start, end, aggExpr.Expression, rateInterval, keys)
if err != nil {
return nil, err
}
allAggChArgs = append(allAggChArgs, chArgs...)
sb.SelectMore(fmt.Sprintf("%s AS __result_%d", rewritten, idx))
}
}
sb.From(fmt.Sprintf("%s.%s", DBName, AuditLogsTableName))
preparedWhereClause, err := b.addFilterCondition(ctx, sb, start, end, query, keys, variables)
if err != nil {
return nil, err
}
sb.GroupBy(querybuilder.GroupByKeys(query.GroupBy)...)
if query.Having != nil && query.Having.Expression != "" {
rewriter := querybuilder.NewHavingExpressionRewriter()
rewrittenExpr, err := rewriter.RewriteForLogs(query.Having.Expression, query.Aggregations)
if err != nil {
return nil, err
}
sb.Having(rewrittenExpr)
}
for _, orderBy := range query.Order {
idx, ok := aggOrderBy(orderBy, query)
if ok {
sb.OrderBy(fmt.Sprintf("__result_%d %s", idx, orderBy.Direction.StringValue()))
} else {
sb.OrderBy(fmt.Sprintf("`%s` %s", orderBy.Key.Name, orderBy.Direction.StringValue()))
}
}
if len(query.Order) == 0 {
sb.OrderBy("__result_0 DESC")
}
if query.Limit > 0 {
sb.Limit(query.Limit)
}
combinedArgs := append(allGroupByArgs, allAggChArgs...)
mainSQL, mainArgs := sb.BuildWithFlavor(sqlbuilder.ClickHouse, combinedArgs...)
finalSQL := querybuilder.CombineCTEs(cteFragments) + mainSQL
finalArgs := querybuilder.PrependArgs(cteArgs, mainArgs)
stmt := &qbtypes.Statement{
Query: finalSQL,
Args: finalArgs,
}
if preparedWhereClause != nil {
stmt.Warnings = preparedWhereClause.Warnings
stmt.WarningsDocURL = preparedWhereClause.WarningsDocURL
}
return stmt, nil
}
func (b *auditQueryStatementBuilder) addFilterCondition(
ctx context.Context,
sb *sqlbuilder.SelectBuilder,
start, end uint64,
query qbtypes.QueryBuilderQuery[qbtypes.LogAggregation],
keys map[string][]*telemetrytypes.TelemetryFieldKey,
variables map[string]qbtypes.VariableItem,
) (*querybuilder.PreparedWhereClause, error) {
var preparedWhereClause *querybuilder.PreparedWhereClause
var err error
if query.Filter != nil && query.Filter.Expression != "" {
preparedWhereClause, err = querybuilder.PrepareWhereClause(query.Filter.Expression, querybuilder.FilterExprVisitorOpts{
Context: ctx,
Logger: b.logger,
FieldMapper: b.fm,
ConditionBuilder: b.cb,
FieldKeys: keys,
SkipResourceFilter: true,
FullTextColumn: b.fullTextColumn,
JsonKeyToKey: b.jsonKeyToKey,
Variables: variables,
StartNs: start,
EndNs: end,
})
if err != nil {
return nil, err
}
}
if preparedWhereClause != nil {
sb.AddWhereClause(preparedWhereClause.WhereClause)
}
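// ts_bucket_start prunes parts by coarse time bucket; BucketAdjustment widens the
// lower bound by one bucket (1800s, per the expected args in the statement builder
// tests) so rows bucketed just before start are not dropped.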
startBucket := start/querybuilder.NsToSeconds - querybuilder.BucketAdjustment
var endBucket uint64
if end != 0 {
endBucket = end / querybuilder.NsToSeconds
}
if start != 0 {
sb.Where(sb.GE("timestamp", fmt.Sprintf("%d", start)), sb.GE("ts_bucket_start", startBucket))
}
if end != 0 {
sb.Where(sb.L("timestamp", fmt.Sprintf("%d", end)), sb.LE("ts_bucket_start", endBucket))
}
return preparedWhereClause, nil
}
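// aggOrderBy reports whether an order-by key refers to an aggregation, matching by
// alias, raw expression, or positional index. Illustrative: with Aggregations =
// []qbtypes.LogAggregation{{Expression: "count()"}}, order keys "count()" and "0"
// both resolve to __result_0.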
func aggOrderBy(k qbtypes.OrderBy, q qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]) (int, bool) {
for i, agg := range q.Aggregations {
if k.Key.Name == agg.Alias || k.Key.Name == agg.Expression || k.Key.Name == fmt.Sprintf("%d", i) {
return i, true
}
}
return 0, false
}
func (b *auditQueryStatementBuilder) maybeAttachResourceFilter(
ctx context.Context,
sb *sqlbuilder.SelectBuilder,
query qbtypes.QueryBuilderQuery[qbtypes.LogAggregation],
start, end uint64,
variables map[string]qbtypes.VariableItem,
) (cteSQL string, cteArgs []any, err error) {
stmt, err := b.resourceFilterStmtBuilder.Build(ctx, start, end, qbtypes.RequestTypeRaw, query, variables)
if err != nil {
return "", nil, err
}
sb.Where("resource_fingerprint GLOBAL IN (SELECT fingerprint FROM __resource_filter)")
return fmt.Sprintf("__resource_filter AS (%s)", stmt.Query), stmt.Args, nil
}

View File

@@ -0,0 +1,223 @@
package telemetryaudit
import (
"context"
"testing"
"time"
"github.com/SigNoz/signoz/pkg/instrumentation/instrumentationtest"
"github.com/SigNoz/signoz/pkg/querybuilder"
qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes/telemetrytypestest"
"github.com/stretchr/testify/require"
)
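// auditFieldKeyMap is the mock key metadata for these tests. Keys flagged as
// materialized resolve to physical columns named by replacing '.' with '$$'
// (signoz.audit.action -> `attribute_string_signoz$$audit$$action`, paired with an
// `..._exists` column), as the expected SQL below shows.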
func auditFieldKeyMap() map[string][]*telemetrytypes.TelemetryFieldKey {
key := func(name string, ctx telemetrytypes.FieldContext, dt telemetrytypes.FieldDataType, materialized bool) *telemetrytypes.TelemetryFieldKey {
return &telemetrytypes.TelemetryFieldKey{
Name: name,
Signal: telemetrytypes.SignalLogs,
FieldContext: ctx,
FieldDataType: dt,
Materialized: materialized,
}
}
attr := telemetrytypes.FieldContextAttribute
res := telemetrytypes.FieldContextResource
str := telemetrytypes.FieldDataTypeString
i64 := telemetrytypes.FieldDataTypeInt64
return map[string][]*telemetrytypes.TelemetryFieldKey{
"service.name": {key("service.name", res, str, false)},
"signoz.audit.action": {key("signoz.audit.action", attr, str, true)},
"signoz.audit.outcome": {key("signoz.audit.outcome", attr, str, true)},
"signoz.audit.principal.email": {key("signoz.audit.principal.email", attr, str, true)},
"signoz.audit.principal.id": {key("signoz.audit.principal.id", attr, str, true)},
"signoz.audit.principal.type": {key("signoz.audit.principal.type", attr, str, true)},
"signoz.audit.resource.kind": {key("signoz.audit.resource.kind", res, str, false)},
"signoz.audit.resource.id": {key("signoz.audit.resource.id", res, str, false)},
"signoz.audit.action_category": {key("signoz.audit.action_category", attr, str, false)},
"signoz.audit.error.type": {key("signoz.audit.error.type", attr, str, false)},
"signoz.audit.error.code": {key("signoz.audit.error.code", attr, str, false)},
"http.request.method": {key("http.request.method", attr, str, false)},
"http.response.status_code": {key("http.response.status_code", attr, i64, false)},
}
}
func newTestAuditStatementBuilder() *auditQueryStatementBuilder {
mockMetadataStore := telemetrytypestest.NewMockMetadataStore()
mockMetadataStore.KeysMap = auditFieldKeyMap()
fm := NewFieldMapper()
cb := NewConditionBuilder(fm)
aggExprRewriter := querybuilder.NewAggExprRewriter(instrumentationtest.New().ToProviderSettings(), nil, fm, cb, nil)
return NewAuditQueryStatementBuilder(
instrumentationtest.New().ToProviderSettings(),
mockMetadataStore,
fm,
cb,
aggExprRewriter,
DefaultFullTextColumn,
nil,
)
}
func TestStatementBuilder(t *testing.T) {
statementBuilder := newTestAuditStatementBuilder()
ctx := context.Background()
testCases := []struct {
name string
requestType qbtypes.RequestType
query qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]
expected qbtypes.Statement
expectedErr error
}{
// List: all actions by a specific user (materialized principal.id filter)
{
name: "ListByPrincipalID",
requestType: qbtypes.RequestTypeRaw,
query: qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]{
Signal: telemetrytypes.SignalLogs,
Source: telemetrytypes.SourceAudit,
Filter: &qbtypes.Filter{
Expression: "signoz.audit.principal.id = '019a-1234-abcd-5678'",
},
Limit: 100,
},
expected: qbtypes.Statement{
Query: "WITH __resource_filter AS (SELECT fingerprint FROM signoz_audit.distributed_logs_resource WHERE true AND seen_at_ts_bucket_start >= ? AND seen_at_ts_bucket_start <= ?) SELECT timestamp, id, trace_id, span_id, trace_flags, severity_text, severity_number, scope_name, scope_version, body, event_name, attributes_string, attributes_number, attributes_bool, resource, scope_string FROM signoz_audit.distributed_logs WHERE resource_fingerprint GLOBAL IN (SELECT fingerprint FROM __resource_filter) AND (`attribute_string_signoz$$audit$$principal$$id` = ? AND `attribute_string_signoz$$audit$$principal$$id_exists` = ?) AND timestamp >= ? AND ts_bucket_start >= ? AND timestamp < ? AND ts_bucket_start <= ? LIMIT ?",
Args: []any{uint64(1747945619), uint64(1747983448), "019a-1234-abcd-5678", true, "1747947419000000000", uint64(1747945619), "1747983448000000000", uint64(1747983448), 100},
},
},
// List: all failed actions (materialized outcome filter)
{
name: "ListByOutcomeFailure",
requestType: qbtypes.RequestTypeRaw,
query: qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]{
Signal: telemetrytypes.SignalLogs,
Source: telemetrytypes.SourceAudit,
Filter: &qbtypes.Filter{
Expression: "signoz.audit.outcome = 'failure'",
},
Limit: 100,
},
expected: qbtypes.Statement{
Query: "WITH __resource_filter AS (SELECT fingerprint FROM signoz_audit.distributed_logs_resource WHERE true AND seen_at_ts_bucket_start >= ? AND seen_at_ts_bucket_start <= ?) SELECT timestamp, id, trace_id, span_id, trace_flags, severity_text, severity_number, scope_name, scope_version, body, event_name, attributes_string, attributes_number, attributes_bool, resource, scope_string FROM signoz_audit.distributed_logs WHERE resource_fingerprint GLOBAL IN (SELECT fingerprint FROM __resource_filter) AND (`attribute_string_signoz$$audit$$outcome` = ? AND `attribute_string_signoz$$audit$$outcome_exists` = ?) AND timestamp >= ? AND ts_bucket_start >= ? AND timestamp < ? AND ts_bucket_start <= ? LIMIT ?",
Args: []any{uint64(1747945619), uint64(1747983448), "failure", true, "1747947419000000000", uint64(1747945619), "1747983448000000000", uint64(1747983448), 100},
},
},
// List: change history of a specific dashboard (two materialized column AND)
{
name: "ListByResourceKindAndID",
requestType: qbtypes.RequestTypeRaw,
query: qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]{
Signal: telemetrytypes.SignalLogs,
Source: telemetrytypes.SourceAudit,
Filter: &qbtypes.Filter{
Expression: "signoz.audit.resource.kind = 'dashboard' AND signoz.audit.resource.id = '019b-5678-efgh-9012'",
},
Limit: 100,
},
expected: qbtypes.Statement{
Query: "WITH __resource_filter AS (SELECT fingerprint FROM signoz_audit.distributed_logs_resource WHERE ((simpleJSONExtractString(labels, 'signoz.audit.resource.kind') = ? AND labels LIKE ? AND labels LIKE ?) AND (simpleJSONExtractString(labels, 'signoz.audit.resource.id') = ? AND labels LIKE ? AND labels LIKE ?)) AND seen_at_ts_bucket_start >= ? AND seen_at_ts_bucket_start <= ?) SELECT timestamp, id, trace_id, span_id, trace_flags, severity_text, severity_number, scope_name, scope_version, body, event_name, attributes_string, attributes_number, attributes_bool, resource, scope_string FROM signoz_audit.distributed_logs WHERE resource_fingerprint GLOBAL IN (SELECT fingerprint FROM __resource_filter) AND true AND timestamp >= ? AND ts_bucket_start >= ? AND timestamp < ? AND ts_bucket_start <= ? LIMIT ?",
Args: []any{"dashboard", "%signoz.audit.resource.kind%", "%signoz.audit.resource.kind\":\"dashboard%", "019b-5678-efgh-9012", "%signoz.audit.resource.id%", "%signoz.audit.resource.id\":\"019b-5678-efgh-9012%", uint64(1747945619), uint64(1747983448), "1747947419000000000", uint64(1747945619), "1747983448000000000", uint64(1747983448), 100},
},
},
// List: all dashboard deletions (compliance — resource.kind + action AND)
{
name: "ListByResourceKindAndAction",
requestType: qbtypes.RequestTypeRaw,
query: qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]{
Signal: telemetrytypes.SignalLogs,
Source: telemetrytypes.SourceAudit,
Filter: &qbtypes.Filter{
Expression: "signoz.audit.resource.kind = 'dashboard' AND signoz.audit.action = 'delete'",
},
Limit: 100,
},
expected: qbtypes.Statement{
Query: "WITH __resource_filter AS (SELECT fingerprint FROM signoz_audit.distributed_logs_resource WHERE (simpleJSONExtractString(labels, 'signoz.audit.resource.kind') = ? AND labels LIKE ? AND labels LIKE ?) AND seen_at_ts_bucket_start >= ? AND seen_at_ts_bucket_start <= ?) SELECT timestamp, id, trace_id, span_id, trace_flags, severity_text, severity_number, scope_name, scope_version, body, event_name, attributes_string, attributes_number, attributes_bool, resource, scope_string FROM signoz_audit.distributed_logs WHERE resource_fingerprint GLOBAL IN (SELECT fingerprint FROM __resource_filter) AND (`attribute_string_signoz$$audit$$action` = ? AND `attribute_string_signoz$$audit$$action_exists` = ?) AND timestamp >= ? AND ts_bucket_start >= ? AND timestamp < ? AND ts_bucket_start <= ? LIMIT ?",
Args: []any{"dashboard", "%signoz.audit.resource.kind%", "%signoz.audit.resource.kind\":\"dashboard%", uint64(1747945619), uint64(1747983448), "delete", true, "1747947419000000000", uint64(1747945619), "1747983448000000000", uint64(1747983448), 100},
},
},
// List: all actions by service accounts (materialized principal.type)
{
name: "ListByPrincipalType",
requestType: qbtypes.RequestTypeRaw,
query: qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]{
Signal: telemetrytypes.SignalLogs,
Source: telemetrytypes.SourceAudit,
Filter: &qbtypes.Filter{
Expression: "signoz.audit.principal.type = 'service_account'",
},
Limit: 100,
},
expected: qbtypes.Statement{
Query: "WITH __resource_filter AS (SELECT fingerprint FROM signoz_audit.distributed_logs_resource WHERE true AND seen_at_ts_bucket_start >= ? AND seen_at_ts_bucket_start <= ?) SELECT timestamp, id, trace_id, span_id, trace_flags, severity_text, severity_number, scope_name, scope_version, body, event_name, attributes_string, attributes_number, attributes_bool, resource, scope_string FROM signoz_audit.distributed_logs WHERE resource_fingerprint GLOBAL IN (SELECT fingerprint FROM __resource_filter) AND (`attribute_string_signoz$$audit$$principal$$type` = ? AND `attribute_string_signoz$$audit$$principal$$type_exists` = ?) AND timestamp >= ? AND ts_bucket_start >= ? AND timestamp < ? AND ts_bucket_start <= ? LIMIT ?",
Args: []any{uint64(1747945619), uint64(1747983448), "service_account", true, "1747947419000000000", uint64(1747945619), "1747983448000000000", uint64(1747983448), 100},
},
},
// Scalar: alert query counting failed update actions (outcome + action AND)
{
name: "ScalarCountByOutcomeAndAction",
requestType: qbtypes.RequestTypeScalar,
query: qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]{
Signal: telemetrytypes.SignalLogs,
Source: telemetrytypes.SourceAudit,
StepInterval: qbtypes.Step{Duration: 60 * time.Second},
Filter: &qbtypes.Filter{
Expression: "signoz.audit.outcome = 'failure' AND signoz.audit.action = 'update'",
},
Aggregations: []qbtypes.LogAggregation{
{Expression: "count()"},
},
},
expected: qbtypes.Statement{
Query: "WITH __resource_filter AS (SELECT fingerprint FROM signoz_audit.distributed_logs_resource WHERE true AND seen_at_ts_bucket_start >= ? AND seen_at_ts_bucket_start <= ?) SELECT count() AS __result_0 FROM signoz_audit.distributed_logs WHERE resource_fingerprint GLOBAL IN (SELECT fingerprint FROM __resource_filter) AND ((`attribute_string_signoz$$audit$$outcome` = ? AND `attribute_string_signoz$$audit$$outcome_exists` = ?) AND (`attribute_string_signoz$$audit$$action` = ? AND `attribute_string_signoz$$audit$$action_exists` = ?)) AND timestamp >= ? AND ts_bucket_start >= ? AND timestamp < ? AND ts_bucket_start <= ? ORDER BY __result_0 DESC",
Args: []any{uint64(1747945619), uint64(1747983448), "failure", true, "update", true, "1747947419000000000", uint64(1747945619), "1747983448000000000", uint64(1747983448)},
},
},
// TimeSeries: failures grouped by principal email with top-N limit
{
name: "TimeSeriesFailuresGroupedByPrincipal",
requestType: qbtypes.RequestTypeTimeSeries,
query: qbtypes.QueryBuilderQuery[qbtypes.LogAggregation]{
Signal: telemetrytypes.SignalLogs,
Source: telemetrytypes.SourceAudit,
StepInterval: qbtypes.Step{Duration: 60 * time.Second},
Aggregations: []qbtypes.LogAggregation{
{Expression: "count()"},
},
Filter: &qbtypes.Filter{
Expression: "signoz.audit.outcome = 'failure'",
},
GroupBy: []qbtypes.GroupByKey{
{TelemetryFieldKey: telemetrytypes.TelemetryFieldKey{Name: "signoz.audit.principal.email"}},
},
Limit: 5,
},
expected: qbtypes.Statement{
Query: "WITH __resource_filter AS (SELECT fingerprint FROM signoz_audit.distributed_logs_resource WHERE true AND seen_at_ts_bucket_start >= ? AND seen_at_ts_bucket_start <= ?), __limit_cte AS (SELECT toString(multiIf(`attribute_string_signoz$$audit$$principal$$email_exists` = ?, `attribute_string_signoz$$audit$$principal$$email`, NULL)) AS `signoz.audit.principal.email`, count() AS __result_0 FROM signoz_audit.distributed_logs WHERE resource_fingerprint GLOBAL IN (SELECT fingerprint FROM __resource_filter) AND (`attribute_string_signoz$$audit$$outcome` = ? AND `attribute_string_signoz$$audit$$outcome_exists` = ?) AND timestamp >= ? AND ts_bucket_start >= ? AND timestamp < ? AND ts_bucket_start <= ? GROUP BY `signoz.audit.principal.email` ORDER BY __result_0 DESC LIMIT ?) SELECT toStartOfInterval(fromUnixTimestamp64Nano(timestamp), INTERVAL 60 SECOND) AS ts, toString(multiIf(`attribute_string_signoz$$audit$$principal$$email_exists` = ?, `attribute_string_signoz$$audit$$principal$$email`, NULL)) AS `signoz.audit.principal.email`, count() AS __result_0 FROM signoz_audit.distributed_logs WHERE resource_fingerprint GLOBAL IN (SELECT fingerprint FROM __resource_filter) AND (`attribute_string_signoz$$audit$$outcome` = ? AND `attribute_string_signoz$$audit$$outcome_exists` = ?) AND timestamp >= ? AND ts_bucket_start >= ? AND timestamp < ? AND ts_bucket_start <= ? AND (`signoz.audit.principal.email`) GLOBAL IN (SELECT `signoz.audit.principal.email` FROM __limit_cte) GROUP BY ts, `signoz.audit.principal.email`",
Args: []any{uint64(1747945619), uint64(1747983448), true, "failure", true, "1747947419000000000", uint64(1747945619), "1747983448000000000", uint64(1747983448), 5, true, "failure", true, "1747947419000000000", uint64(1747945619), "1747983448000000000", uint64(1747983448)},
},
},
}
for _, testCase := range testCases {
t.Run(testCase.name, func(t *testing.T) {
q, err := statementBuilder.Build(ctx, 1747947419000, 1747983448000, testCase.requestType, testCase.query, nil)
if testCase.expectedErr != nil {
require.Error(t, err)
require.Contains(t, err.Error(), testCase.expectedErr.Error())
} else {
require.NoError(t, err)
require.Equal(t, testCase.expected.Query, q.Query)
require.Equal(t, testCase.expected.Args, q.Args)
}
})
}
}

View File

@@ -0,0 +1,12 @@
package telemetryaudit
const (
DBName = "signoz_audit"
AuditLogsTableName = "distributed_logs"
AuditLogsLocalTableName = "logs"
TagAttributesTableName = "distributed_tag_attributes"
TagAttributesLocalTableName = "tag_attributes"
LogAttributeKeysTblName = "distributed_logs_attribute_keys"
LogResourceKeysTblName = "distributed_logs_resource_keys"
LogsResourceTableName = "distributed_logs_resource"
)

View File

@@ -45,6 +45,7 @@ func NewLogQueryStatementBuilder(
DBName,
LogsResourceV2TableName,
telemetrytypes.SignalLogs,
telemetrytypes.SourceUnspecified,
metadataStore,
fullTextColumn,
jsonKeyToKey,

View File

@@ -13,6 +13,7 @@ import (
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/querybuilder"
"github.com/SigNoz/signoz/pkg/telemetryaudit"
"github.com/SigNoz/signoz/pkg/telemetrylogs"
"github.com/SigNoz/signoz/pkg/telemetrymetrics"
"github.com/SigNoz/signoz/pkg/telemetrystore"
@@ -27,6 +28,7 @@ import (
var (
ErrFailedToGetTracesKeys = errors.Newf(errors.TypeInternal, errors.CodeInternal, "failed to get traces keys")
ErrFailedToGetLogsKeys = errors.Newf(errors.TypeInternal, errors.CodeInternal, "failed to get logs keys")
ErrFailedToGetAuditKeys = errors.Newf(errors.TypeInternal, errors.CodeInternal, "failed to get audit keys")
ErrFailedToGetTblStatement = errors.Newf(errors.TypeInternal, errors.CodeInternal, "failed to get tbl statement")
ErrFailedToGetMetricsKeys = errors.Newf(errors.TypeInternal, errors.CodeInternal, "failed to get metrics keys")
ErrFailedToGetMeterKeys = errors.Newf(errors.TypeInternal, errors.CodeInternal, "failed to get meter keys")
@@ -50,6 +52,11 @@ type telemetryMetaStore struct {
logAttributeKeysTblName string
logResourceKeysTblName string
logsV2TblName string
auditDBName string
auditLogsTblName string
auditFieldsTblName string
auditAttributeKeysTblName string
auditResourceKeysTblName string
relatedMetadataDBName string
relatedMetadataTblName string
columnEvolutionMetadataTblName string
@@ -79,6 +86,11 @@ func NewTelemetryMetaStore(
logsFieldsTblName string,
logAttributeKeysTblName string,
logResourceKeysTblName string,
auditDBName string,
auditLogsTblName string,
auditFieldsTblName string,
auditAttributeKeysTblName string,
auditResourceKeysTblName string,
relatedMetadataDBName string,
relatedMetadataTblName string,
columnEvolutionMetadataTblName string,
@@ -101,6 +113,11 @@ func NewTelemetryMetaStore(
logsFieldsTblName: logsFieldsTblName,
logAttributeKeysTblName: logAttributeKeysTblName,
logResourceKeysTblName: logResourceKeysTblName,
auditDBName: auditDBName,
auditLogsTblName: auditLogsTblName,
auditFieldsTblName: auditFieldsTblName,
auditAttributeKeysTblName: auditAttributeKeysTblName,
auditResourceKeysTblName: auditResourceKeysTblName,
relatedMetadataDBName: relatedMetadataDBName,
relatedMetadataTblName: relatedMetadataTblName,
columnEvolutionMetadataTblName: columnEvolutionMetadataTblName,
@@ -592,6 +609,227 @@ func (t *telemetryMetaStore) getLogsKeys(ctx context.Context, fieldKeySelectors
return keys, complete, nil
}
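// auditTblStatementToFieldKeys discovers the materialized audit columns by parsing
// the SHOW CREATE TABLE output for signoz_audit.distributed_logs, tagging each key
// with SignalLogs.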
func (t *telemetryMetaStore) auditTblStatementToFieldKeys(ctx context.Context) ([]*telemetrytypes.TelemetryFieldKey, error) {
ctx = ctxtypes.NewContextWithCommentVals(ctx, map[string]string{
instrumentationtypes.TelemetrySignal: telemetrytypes.SignalLogs.StringValue(),
instrumentationtypes.CodeNamespace: "metadata",
instrumentationtypes.CodeFunctionName: "auditTblStatementToFieldKeys",
})
query := fmt.Sprintf("SHOW CREATE TABLE %s.%s", t.auditDBName, t.auditLogsTblName)
statements := []telemetrytypes.ShowCreateTableStatement{}
err := t.telemetrystore.ClickhouseDB().Select(ctx, &statements, query)
if err != nil {
return nil, errors.Wrap(err, errors.TypeInternal, errors.CodeInternal, ErrFailedToGetTblStatement.Error())
}
if len(statements) == 0 {
return nil, ErrFailedToGetTblStatement
}
materialisedKeys, err := ExtractFieldKeysFromTblStatement(statements[0].Statement)
if err != nil {
return nil, errors.Wrap(err, errors.TypeInternal, errors.CodeInternal, ErrFailedToGetAuditKeys.Error())
}
for idx := range materialisedKeys {
materialisedKeys[idx].Signal = telemetrytypes.SignalLogs
}
return materialisedKeys, nil
}
func (t *telemetryMetaStore) getAuditKeys(ctx context.Context, fieldKeySelectors []*telemetrytypes.FieldKeySelector) ([]*telemetrytypes.TelemetryFieldKey, bool, error) {
ctx = ctxtypes.NewContextWithCommentVals(ctx, map[string]string{
instrumentationtypes.TelemetrySignal: telemetrytypes.SignalLogs.StringValue(),
instrumentationtypes.CodeNamespace: "metadata",
instrumentationtypes.CodeFunctionName: "getAuditKeys",
})
if len(fieldKeySelectors) == 0 {
return nil, true, nil
}
matKeys, err := t.auditTblStatementToFieldKeys(ctx)
if err != nil {
return nil, false, err
}
mapOfKeys := make(map[string]*telemetrytypes.TelemetryFieldKey)
for _, key := range matKeys {
mapOfKeys[key.Name+";"+key.FieldContext.StringValue()+";"+key.FieldDataType.StringValue()] = key
}
var queries []string
var allArgs []any
queryAttributeTable := false
queryResourceTable := false
for _, selector := range fieldKeySelectors {
if selector.FieldContext == telemetrytypes.FieldContextUnspecified {
queryAttributeTable = true
queryResourceTable = true
break
} else if selector.FieldContext == telemetrytypes.FieldContextAttribute {
queryAttributeTable = true
} else if selector.FieldContext == telemetrytypes.FieldContextResource {
queryResourceTable = true
}
}
tablesToQuery := []struct {
fieldContext telemetrytypes.FieldContext
shouldQuery bool
tblName string
}{
{telemetrytypes.FieldContextAttribute, queryAttributeTable, t.auditDBName + "." + t.auditAttributeKeysTblName},
{telemetrytypes.FieldContextResource, queryResourceTable, t.auditDBName + "." + t.auditResourceKeysTblName},
}
for _, table := range tablesToQuery {
if !table.shouldQuery {
continue
}
fieldContext := table.fieldContext
tblName := table.tblName
sb := sqlbuilder.Select(
"name AS tag_key",
fmt.Sprintf("'%s' AS tag_type", fieldContext.TagType()),
"lower(datatype) AS tag_data_type",
fmt.Sprintf("%d AS priority", getPriorityForContext(fieldContext)),
).From(tblName)
var limit int
conds := []string{}
for _, fieldKeySelector := range fieldKeySelectors {
if fieldKeySelector.FieldContext != telemetrytypes.FieldContextUnspecified && fieldKeySelector.FieldContext != fieldContext {
continue
}
fieldKeyConds := []string{}
if fieldKeySelector.SelectorMatchType == telemetrytypes.FieldSelectorMatchTypeExact {
fieldKeyConds = append(fieldKeyConds, sb.E("name", fieldKeySelector.Name))
} else {
fieldKeyConds = append(fieldKeyConds, sb.ILike("name", "%"+escapeForLike(fieldKeySelector.Name)+"%"))
}
if fieldKeySelector.FieldDataType != telemetrytypes.FieldDataTypeUnspecified {
fieldKeyConds = append(fieldKeyConds, sb.E("datatype", fieldKeySelector.FieldDataType.TagDataType()))
}
if len(fieldKeyConds) > 0 {
conds = append(conds, sb.And(fieldKeyConds...))
}
limit += fieldKeySelector.Limit
}
if len(conds) > 0 {
sb.Where(sb.Or(conds...))
}
sb.GroupBy("name", "datatype")
if limit == 0 {
limit = 1000
}
query, args := sb.BuildWithFlavor(sqlbuilder.ClickHouse)
queries = append(queries, query)
allArgs = append(allArgs, args...)
}
if len(queries) == 0 {
return []*telemetrytypes.TelemetryFieldKey{}, true, nil
}
var limit int
for _, fieldKeySelector := range fieldKeySelectors {
limit += fieldKeySelector.Limit
}
if limit == 0 {
limit = 1000
}
mainQuery := fmt.Sprintf(`
SELECT tag_key, tag_type, tag_data_type, max(priority) as priority
FROM (
%s
) AS combined_results
GROUP BY tag_key, tag_type, tag_data_type
ORDER BY priority
LIMIT %d
`, strings.Join(queries, " UNION ALL "), limit+1)
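// Fetch one extra row (limit+1) so `complete` below can report whether more keys
// exist beyond the requested limit.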
rows, err := t.telemetrystore.ClickhouseDB().Query(ctx, mainQuery, allArgs...)
if err != nil {
return nil, false, errors.Wrap(err, errors.TypeInternal, errors.CodeInternal, ErrFailedToGetAuditKeys.Error())
}
defer rows.Close()
keys := []*telemetrytypes.TelemetryFieldKey{}
rowCount := 0
searchTexts := []string{}
for _, fieldKeySelector := range fieldKeySelectors {
searchTexts = append(searchTexts, fieldKeySelector.Name)
}
for rows.Next() {
rowCount++
if rowCount > limit {
break
}
var name string
var fieldContext telemetrytypes.FieldContext
var fieldDataType telemetrytypes.FieldDataType
var priority uint8
err = rows.Scan(&name, &fieldContext, &fieldDataType, &priority)
if err != nil {
return nil, false, errors.Wrap(err, errors.TypeInternal, errors.CodeInternal, ErrFailedToGetAuditKeys.Error())
}
key, ok := mapOfKeys[name+";"+fieldContext.StringValue()+";"+fieldDataType.StringValue()]
if !ok {
key = &telemetrytypes.TelemetryFieldKey{
Name: name,
Signal: telemetrytypes.SignalLogs,
FieldContext: fieldContext,
FieldDataType: fieldDataType,
}
}
keys = append(keys, key)
mapOfKeys[name+";"+fieldContext.StringValue()+";"+fieldDataType.StringValue()] = key
}
if rows.Err() != nil {
return nil, false, errors.Wrap(rows.Err(), errors.TypeInternal, errors.CodeInternal, ErrFailedToGetAuditKeys.Error())
}
complete := rowCount <= limit
// Add intrinsic audit fields (same as logs intrinsics: body, severity_text, etc.)
staticKeys := maps.Keys(telemetryaudit.IntrinsicFields)
for _, key := range staticKeys {
found := false
for _, v := range searchTexts {
if v == "" || strings.Contains(key, v) {
found = true
break
}
}
if found {
if field, exists := telemetryaudit.IntrinsicFields[key]; exists {
if _, added := mapOfKeys[field.Name+";"+field.FieldContext.StringValue()+";"+field.FieldDataType.StringValue()]; !added {
keys = append(keys, &field)
}
}
}
}
return keys, complete, nil
}
func getPriorityForContext(ctx telemetrytypes.FieldContext) int {
switch ctx {
case telemetrytypes.FieldContextLog:
@@ -889,7 +1127,11 @@ func (t *telemetryMetaStore) GetKeys(ctx context.Context, fieldKeySelector *tele
case telemetrytypes.SignalTraces:
keys, complete, err = t.getTracesKeys(ctx, selectors)
case telemetrytypes.SignalLogs:
if fieldKeySelector.Source == telemetrytypes.SourceAudit {
keys, complete, err = t.getAuditKeys(ctx, selectors)
} else {
keys, complete, err = t.getLogsKeys(ctx, selectors)
}
case telemetrytypes.SignalMetrics:
if fieldKeySelector.Source == telemetrytypes.SourceMeter {
keys, complete, err = t.getMeterSourceMetricKeys(ctx, selectors)
@@ -938,6 +1180,7 @@ func (t *telemetryMetaStore) GetKeys(ctx context.Context, fieldKeySelector *tele
func (t *telemetryMetaStore) GetKeysMulti(ctx context.Context, fieldKeySelectors []*telemetrytypes.FieldKeySelector) (map[string][]*telemetrytypes.TelemetryFieldKey, bool, error) {
logsSelectors := []*telemetrytypes.FieldKeySelector{}
auditSelectors := []*telemetrytypes.FieldKeySelector{}
tracesSelectors := []*telemetrytypes.FieldKeySelector{}
metricsSelectors := []*telemetrytypes.FieldKeySelector{}
meterSourceMetricsSelectors := []*telemetrytypes.FieldKeySelector{}
@@ -945,7 +1188,11 @@ func (t *telemetryMetaStore) GetKeysMulti(ctx context.Context, fieldKeySelectors
for _, fieldKeySelector := range fieldKeySelectors {
switch fieldKeySelector.Signal {
case telemetrytypes.SignalLogs:
if fieldKeySelector.Source == telemetrytypes.SourceAudit {
auditSelectors = append(auditSelectors, fieldKeySelector)
} else {
logsSelectors = append(logsSelectors, fieldKeySelector)
}
case telemetrytypes.SignalTraces:
tracesSelectors = append(tracesSelectors, fieldKeySelector)
case telemetrytypes.SignalMetrics:
@@ -965,6 +1212,10 @@ func (t *telemetryMetaStore) GetKeysMulti(ctx context.Context, fieldKeySelectors
if err != nil {
return nil, false, err
}
auditKeys, auditComplete, err := t.getAuditKeys(ctx, auditSelectors)
if err != nil {
return nil, false, err
}
tracesKeys, tracesComplete, err := t.getTracesKeys(ctx, tracesSelectors)
if err != nil {
return nil, false, err
@@ -979,12 +1230,15 @@ func (t *telemetryMetaStore) GetKeysMulti(ctx context.Context, fieldKeySelectors
return nil, false, err
}
// Complete only if all queries are complete
complete := logsComplete && auditComplete && tracesComplete && metricsComplete
mapOfKeys := make(map[string][]*telemetrytypes.TelemetryFieldKey)
for _, key := range logsKeys {
mapOfKeys[key.Name] = append(mapOfKeys[key.Name], key)
}
for _, key := range auditKeys {
mapOfKeys[key.Name] = append(mapOfKeys[key.Name], key)
}
for _, key := range tracesKeys {
mapOfKeys[key.Name] = append(mapOfKeys[key.Name], key)
}
@@ -1338,6 +1592,97 @@ func (t *telemetryMetaStore) getLogFieldValues(ctx context.Context, fieldValueSe
return values, complete, nil
}
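// getAuditFieldValues returns distinct values for an audit field from the
// signoz_audit tag attributes table, and whether the result set is complete.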
func (t *telemetryMetaStore) getAuditFieldValues(ctx context.Context, fieldValueSelector *telemetrytypes.FieldValueSelector) (*telemetrytypes.TelemetryFieldValues, bool, error) {
ctx = ctxtypes.NewContextWithCommentVals(ctx, map[string]string{
instrumentationtypes.TelemetrySignal: telemetrytypes.SignalLogs.StringValue(),
instrumentationtypes.CodeNamespace: "metadata",
instrumentationtypes.CodeFunctionName: "getAuditFieldValues",
})
limit := fieldValueSelector.Limit
if limit == 0 {
limit = 50
}
sb := sqlbuilder.Select("DISTINCT string_value, number_value").From(t.auditDBName + "." + t.auditFieldsTblName)
if fieldValueSelector.Name != "" {
sb.Where(sb.E("tag_key", fieldValueSelector.Name))
}
if fieldValueSelector.FieldContext != telemetrytypes.FieldContextUnspecified {
sb.Where(sb.E("tag_type", fieldValueSelector.FieldContext.TagType()))
}
if fieldValueSelector.FieldDataType != telemetrytypes.FieldDataTypeUnspecified {
sb.Where(sb.E("tag_data_type", fieldValueSelector.FieldDataType.TagDataType()))
}
if fieldValueSelector.Value != "" {
switch fieldValueSelector.FieldDataType {
case telemetrytypes.FieldDataTypeString:
sb.Where(sb.ILike("string_value", "%"+escapeForLike(fieldValueSelector.Value)+"%"))
case telemetrytypes.FieldDataTypeNumber:
sb.Where(sb.IsNotNull("number_value"))
sb.Where(sb.ILike("toString(number_value)", "%"+escapeForLike(fieldValueSelector.Value)+"%"))
case telemetrytypes.FieldDataTypeUnspecified:
sb.Where(sb.Or(
sb.ILike("string_value", "%"+escapeForLike(fieldValueSelector.Value)+"%"),
sb.ILike("toString(number_value)", "%"+escapeForLike(fieldValueSelector.Value)+"%"),
))
}
}
// fetch one extra row to detect whether the result set is complete
sb.Limit(limit + 1)
query, args := sb.BuildWithFlavor(sqlbuilder.ClickHouse)
rows, err := t.telemetrystore.ClickhouseDB().Query(ctx, query, args...)
if err != nil {
return nil, false, errors.Wrap(err, errors.TypeInternal, errors.CodeInternal, ErrFailedToGetAuditKeys.Error())
}
defer rows.Close()
values := &telemetrytypes.TelemetryFieldValues{}
seen := make(map[string]bool)
rowCount := 0
totalCount := 0
for rows.Next() {
rowCount++
var stringValue string
var numberValue float64
err = rows.Scan(&stringValue, &numberValue)
if err != nil {
return nil, false, errors.Wrap(err, errors.TypeInternal, errors.CodeInternal, ErrFailedToGetAuditKeys.Error())
}
if stringValue != "" && !seen[stringValue] {
if totalCount >= limit {
break
}
values.StringValues = append(values.StringValues, stringValue)
seen[stringValue] = true
totalCount++
}
if numberValue != 0 {
if totalCount >= limit {
break
}
if !seen[fmt.Sprintf("%f", numberValue)] {
values.NumberValues = append(values.NumberValues, numberValue)
seen[fmt.Sprintf("%f", numberValue)] = true
totalCount++
}
}
}
complete := rowCount <= limit
return values, complete, nil
}
// getMetricFieldValues returns field values and whether the result is complete.
func (t *telemetryMetaStore) getMetricFieldValues(ctx context.Context, fieldValueSelector *telemetrytypes.FieldValueSelector) (*telemetrytypes.TelemetryFieldValues, bool, error) {
ctx = ctxtypes.NewContextWithCommentVals(ctx, map[string]string{
@@ -1628,7 +1973,11 @@ func (t *telemetryMetaStore) GetAllValues(ctx context.Context, fieldValueSelecto
case telemetrytypes.SignalTraces:
values, complete, err = t.getSpanFieldValues(ctx, fieldValueSelector)
case telemetrytypes.SignalLogs:
if fieldValueSelector.Source == telemetrytypes.SourceAudit {
values, complete, err = t.getAuditFieldValues(ctx, fieldValueSelector)
} else {
values, complete, err = t.getLogFieldValues(ctx, fieldValueSelector)
}
case telemetrytypes.SignalMetrics:
if fieldValueSelector.Source == telemetrytypes.SourceMeter {
values, complete, err = t.getMeterSourceMetricFieldValues(ctx, fieldValueSelector)

View File

@@ -5,6 +5,7 @@ import (
"testing"
"github.com/SigNoz/signoz/pkg/instrumentation/instrumentationtest"
"github.com/SigNoz/signoz/pkg/telemetryaudit"
"github.com/SigNoz/signoz/pkg/telemetrylogs"
"github.com/SigNoz/signoz/pkg/telemetrymeter"
"github.com/SigNoz/signoz/pkg/telemetrymetrics"
@@ -37,6 +38,11 @@ func TestGetFirstSeenFromMetricMetadata(t *testing.T) {
telemetrylogs.TagAttributesV2TableName,
telemetrylogs.LogAttributeKeysTblName,
telemetrylogs.LogResourceKeysTblName,
telemetryaudit.DBName,
telemetryaudit.AuditLogsTableName,
telemetryaudit.TagAttributesTableName,
telemetryaudit.LogAttributeKeysTblName,
telemetryaudit.LogResourceKeysTblName,
DBName,
AttributesMetadataLocalTableName,
ColumnEvolutionMetadataTableName,

View File

@@ -7,6 +7,7 @@ import (
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/instrumentation/instrumentationtest"
"github.com/SigNoz/signoz/pkg/telemetryaudit"
"github.com/SigNoz/signoz/pkg/telemetrylogs"
"github.com/SigNoz/signoz/pkg/telemetrymeter"
"github.com/SigNoz/signoz/pkg/telemetrymetrics"
@@ -36,6 +37,11 @@ func newTestTelemetryMetaStoreTestHelper(store telemetrystore.TelemetryStore) te
telemetrylogs.TagAttributesV2TableName,
telemetrylogs.LogAttributeKeysTblName,
telemetrylogs.LogResourceKeysTblName,
telemetryaudit.DBName,
telemetryaudit.AuditLogsTableName,
telemetryaudit.TagAttributesTableName,
telemetryaudit.LogAttributeKeysTblName,
telemetryaudit.LogResourceKeysTblName,
DBName,
AttributesMetadataLocalTableName,
ColumnEvolutionMetadataTableName,

View File

@@ -9,6 +9,6 @@ const (
ColumnEvolutionMetadataTableName = "distributed_column_evolution_metadata"
PathTypesTableName = otelcollectorconst.DistributedPathTypesTable
// Column Evolution table stores promoted paths as (signal, column_name, field_context, field_name); see signoz-otel-collector metadata_migrations.
PromotedPathsTableName = "distributed_column_evolution_metadata"
SkipIndexTableName = "system.data_skipping_indices"
)

View File

@@ -21,6 +21,7 @@ type resourceFilterStatementBuilder[T any] struct {
conditionBuilder qbtypes.ConditionBuilder
metadataStore telemetrytypes.MetadataStore
signal telemetrytypes.Signal
source telemetrytypes.Source
fullTextColumn *telemetrytypes.TelemetryFieldKey
jsonKeyToKey qbtypes.JsonKeyToFieldFunc
@@ -37,6 +38,7 @@ func New[T any](
dbName string,
tableName string,
signal telemetrytypes.Signal,
source telemetrytypes.Source,
metadataStore telemetrytypes.MetadataStore,
fullTextColumn *telemetrytypes.TelemetryFieldKey,
jsonKeyToKey qbtypes.JsonKeyToFieldFunc,
@@ -52,6 +54,7 @@ func New[T any](
conditionBuilder: cb,
metadataStore: metadataStore,
signal: signal,
source: source,
fullTextColumn: fullTextColumn,
jsonKeyToKey: jsonKeyToKey,
}
@@ -72,6 +75,7 @@ func (b *resourceFilterStatementBuilder[T]) getKeySelectors(query qbtypes.QueryB
continue
}
keySelectors[idx].Signal = b.signal
keySelectors[idx].Source = b.source
keySelectors[idx].SelectorMatchType = telemetrytypes.FieldSelectorMatchTypeExact
filteredKeySelectors = append(filteredKeySelectors, keySelectors[idx])
}

View File

@@ -375,6 +375,7 @@ func TestResourceFilterStatementBuilder_Traces(t *testing.T) {
"signoz_traces",
"distributed_traces_v3_resource",
telemetrytypes.SignalTraces,
telemetrytypes.SourceUnspecified,
mockMetadataStore,
nil,
nil,
@@ -592,6 +593,7 @@ func TestResourceFilterStatementBuilder_Logs(t *testing.T) {
"signoz_logs",
"distributed_logs_v2_resource",
telemetrytypes.SignalLogs,
telemetrytypes.SourceUnspecified,
mockMetadataStore,
nil,
nil,
@@ -653,6 +655,7 @@ func TestResourceFilterStatementBuilder_Variables(t *testing.T) {
"signoz_traces",
"distributed_traces_v3_resource",
telemetrytypes.SignalTraces,
telemetrytypes.SourceUnspecified,
mockMetadataStore,
nil,
nil,

View File

@@ -49,6 +49,7 @@ func NewTraceQueryStatementBuilder(
DBName,
TracesResourceV3TableName,
telemetrytypes.SignalTraces,
telemetrytypes.SourceUnspecified,
metadataStore,
nil,
nil,

View File

@@ -39,6 +39,7 @@ func NewTraceOperatorStatementBuilder(
DBName,
TracesResourceV3TableName,
telemetrytypes.SignalTraces,
telemetrytypes.SourceUnspecified,
metadataStore,
nil,
nil,

View File

@@ -7,11 +7,13 @@ type Source struct {
}
var (
SourceAudit = Source{valuer.NewString("audit")}
SourceMeter = Source{valuer.NewString("meter")}
SourceUnspecified = Source{valuer.NewString("")}
)
// Enum returns the acceptable values for Source.
// TODO: Add SourceAudit once the frontend is ready for consumption.
func (Source) Enum() []any {
return []any{
SourceMeter,

View File

@@ -12,6 +12,7 @@ pytest_plugins = [
"fixtures.sqlite",
"fixtures.zookeeper",
"fixtures.signoz",
"fixtures.audit",
"fixtures.logs",
"fixtures.traces",
"fixtures.metrics",

View File

@@ -0,0 +1,404 @@
import datetime
import json
from abc import ABC
from typing import Any, Callable, Generator, List, Optional
import numpy as np
import pytest
from ksuid import KsuidMs
from fixtures import types
from fixtures.fingerprint import LogsOrTracesFingerprint
class AuditResource(ABC):
labels: str
fingerprint: str
seen_at_ts_bucket_start: np.int64
def __init__(
self,
labels: dict[str, str],
fingerprint: str,
seen_at_ts_bucket_start: np.int64,
) -> None:
self.labels = json.dumps(labels, separators=(",", ":"))
self.fingerprint = fingerprint
self.seen_at_ts_bucket_start = seen_at_ts_bucket_start
def np_arr(self) -> np.ndarray:
return np.array(
[
self.labels,
self.fingerprint,
self.seen_at_ts_bucket_start,
]
)
class AuditResourceOrAttributeKeys(ABC):
name: str
datatype: str
def __init__(self, name: str, datatype: str) -> None:
self.name = name
self.datatype = datatype
def np_arr(self) -> np.ndarray:
return np.array([self.name, self.datatype])
class AuditTagAttributes(ABC):
unix_milli: np.int64
tag_key: str
tag_type: str
tag_data_type: str
string_value: str
int64_value: Optional[np.int64]
float64_value: Optional[np.float64]
def __init__(
self,
timestamp: datetime.datetime,
tag_key: str,
tag_type: str,
tag_data_type: str,
string_value: Optional[str],
int64_value: Optional[np.int64],
float64_value: Optional[np.float64],
) -> None:
self.unix_milli = np.int64(int(timestamp.timestamp() * 1e3))
self.tag_key = tag_key
self.tag_type = tag_type
self.tag_data_type = tag_data_type
self.string_value = string_value or ""
self.int64_value = int64_value
self.float64_value = float64_value
def np_arr(self) -> np.ndarray:
return np.array(
[
self.unix_milli,
self.tag_key,
self.tag_type,
self.tag_data_type,
self.string_value,
self.int64_value,
self.float64_value,
]
)
class AuditLog(ABC):
"""Represents a single audit log event in signoz_audit.
Matches the ClickHouse DDL from the schema migration (ticket #1936):
- Database: signoz_audit
- Local table: logs
- Distributed table: distributed_logs
- No resources_string column (resource JSON only)
- Has event_name column
- 7 materialized columns auto-populated from attributes_string at INSERT time
"""
ts_bucket_start: np.uint64
resource_fingerprint: str
timestamp: np.uint64
observed_timestamp: np.uint64
id: str
trace_id: str
span_id: str
trace_flags: np.uint32
severity_text: str
severity_number: np.uint8
body: str
scope_name: str
scope_version: str
scope_string: dict[str, str]
attributes_string: dict[str, str]
attributes_number: dict[str, np.float64]
attributes_bool: dict[str, bool]
resource_json: dict[str, str]
event_name: str
resource: List[AuditResource]
tag_attributes: List[AuditTagAttributes]
resource_keys: List[AuditResourceOrAttributeKeys]
attribute_keys: List[AuditResourceOrAttributeKeys]
def __init__(
self,
timestamp: Optional[datetime.datetime] = None,
resources: dict[str, Any] = {},
attributes: dict[str, Any] = {},
body: str = "",
event_name: str = "",
severity_text: str = "INFO",
trace_id: str = "",
span_id: str = "",
trace_flags: np.uint32 = 0,
scope_name: str = "signoz.audit",
scope_version: str = "",
) -> None:
if timestamp is None:
timestamp = datetime.datetime.now()
self.tag_attributes = []
self.attribute_keys = []
self.resource_keys = []
self.timestamp = np.uint64(int(timestamp.timestamp() * 1e9))
self.observed_timestamp = self.timestamp
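# ts_bucket_start floors the event time to its half-hour bucket, mirroring the
# ts_bucket_start pruning done by the Go statement builder.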
minute = timestamp.minute
bucket_minute = 0 if minute < 30 else 30
bucket_start = timestamp.replace(minute=bucket_minute, second=0, microsecond=0)
self.ts_bucket_start = np.uint64(int(bucket_start.timestamp()))
self.id = str(KsuidMs(datetime=timestamp))
self.trace_id = trace_id
self.span_id = span_id
self.trace_flags = trace_flags
self.severity_text = severity_text
self.severity_number = np.uint8(9 if severity_text == "INFO" else 17)
self.body = body
self.event_name = event_name
# Resources — JSON column only (no resources_string in audit DDL)
self.resource_json = {k: str(v) for k, v in resources.items()}
for k, v in self.resource_json.items():
self.tag_attributes.append(
AuditTagAttributes(
timestamp=timestamp,
tag_key=k,
tag_type="resource",
tag_data_type="string",
string_value=str(v),
int64_value=None,
float64_value=None,
)
)
self.resource_keys.append(
AuditResourceOrAttributeKeys(name=k, datatype="string")
)
self.resource_fingerprint = LogsOrTracesFingerprint(
self.resource_json
).calculate()
# Process attributes by type
self.attributes_string = {}
self.attributes_number = {}
self.attributes_bool = {}
for k, v in attributes.items():
if isinstance(v, bool):
self.attributes_bool[k] = v
self.tag_attributes.append(
AuditTagAttributes(
timestamp=timestamp,
tag_key=k,
tag_type="tag",
tag_data_type="bool",
string_value=None,
int64_value=None,
float64_value=None,
)
)
self.attribute_keys.append(
AuditResourceOrAttributeKeys(name=k, datatype="bool")
)
elif isinstance(v, int):
self.attributes_number[k] = np.float64(v)
self.tag_attributes.append(
AuditTagAttributes(
timestamp=timestamp,
tag_key=k,
tag_type="tag",
tag_data_type="int64",
string_value=None,
int64_value=np.int64(v),
float64_value=None,
)
)
self.attribute_keys.append(
AuditResourceOrAttributeKeys(name=k, datatype="int64")
)
elif isinstance(v, float):
self.attributes_number[k] = np.float64(v)
self.tag_attributes.append(
AuditTagAttributes(
timestamp=timestamp,
tag_key=k,
tag_type="tag",
tag_data_type="float64",
string_value=None,
int64_value=None,
float64_value=np.float64(v),
)
)
self.attribute_keys.append(
AuditResourceOrAttributeKeys(name=k, datatype="float64")
)
else:
self.attributes_string[k] = str(v)
self.tag_attributes.append(
AuditTagAttributes(
timestamp=timestamp,
tag_key=k,
tag_type="tag",
tag_data_type="string",
string_value=str(v),
int64_value=None,
float64_value=None,
)
)
self.attribute_keys.append(
AuditResourceOrAttributeKeys(name=k, datatype="string")
)
self.scope_name = scope_name
self.scope_version = scope_version
self.scope_string = {}
self.resource = [
AuditResource(
labels=self.resource_json,
fingerprint=self.resource_fingerprint,
seen_at_ts_bucket_start=self.ts_bucket_start,
)
]
def np_arr(self) -> np.ndarray:
return np.array(
[
self.ts_bucket_start,
self.resource_fingerprint,
self.timestamp,
self.observed_timestamp,
self.id,
self.trace_id,
self.span_id,
self.trace_flags,
self.severity_text,
self.severity_number,
self.body,
self.scope_name,
self.scope_version,
self.scope_string,
self.attributes_string,
self.attributes_number,
self.attributes_bool,
self.resource_json,
self.event_name,
]
)
@pytest.fixture(name="insert_audit_logs", scope="function")
def insert_audit_logs(
clickhouse: types.TestContainerClickhouse,
) -> Generator[Callable[[List[AuditLog]], None], Any, None]:
def _insert_audit_logs(logs: List[AuditLog]) -> None:
resources: List[AuditResource] = []
for log in logs:
resources.extend(log.resource)
if len(resources) > 0:
clickhouse.conn.insert(
database="signoz_audit",
table="distributed_logs_resource",
data=[resource.np_arr() for resource in resources],
column_names=[
"labels",
"fingerprint",
"seen_at_ts_bucket_start",
],
)
tag_attributes: List[AuditTagAttributes] = []
for log in logs:
tag_attributes.extend(log.tag_attributes)
if len(tag_attributes) > 0:
clickhouse.conn.insert(
database="signoz_audit",
table="distributed_tag_attributes",
data=[ta.np_arr() for ta in tag_attributes],
column_names=[
"unix_milli",
"tag_key",
"tag_type",
"tag_data_type",
"string_value",
"int64_value",
"float64_value",
],
)
attribute_keys: List[AuditResourceOrAttributeKeys] = []
for log in logs:
attribute_keys.extend(log.attribute_keys)
if len(attribute_keys) > 0:
clickhouse.conn.insert(
database="signoz_audit",
table="distributed_logs_attribute_keys",
data=[ak.np_arr() for ak in attribute_keys],
column_names=["name", "datatype"],
)
resource_keys: List[AuditResourceOrAttributeKeys] = []
for log in logs:
resource_keys.extend(log.resource_keys)
if len(resource_keys) > 0:
clickhouse.conn.insert(
database="signoz_audit",
table="distributed_logs_resource_keys",
data=[rk.np_arr() for rk in resource_keys],
column_names=["name", "datatype"],
)
clickhouse.conn.insert(
database="signoz_audit",
table="distributed_logs",
data=[log.np_arr() for log in logs],
column_names=[
"ts_bucket_start",
"resource_fingerprint",
"timestamp",
"observed_timestamp",
"id",
"trace_id",
"span_id",
"trace_flags",
"severity_text",
"severity_number",
"body",
"scope_name",
"scope_version",
"scope_string",
"attributes_string",
"attributes_number",
"attributes_bool",
"resource",
"event_name",
],
)
yield _insert_audit_logs
cluster = clickhouse.env["SIGNOZ_TELEMETRYSTORE_CLICKHOUSE_CLUSTER"]
for table in [
"logs",
"logs_resource",
"tag_attributes",
"logs_attribute_keys",
"logs_resource_keys",
]:
clickhouse.conn.query(
f"TRUNCATE TABLE signoz_audit.{table} ON CLUSTER '{cluster}' SYNC"
)

View File

@@ -38,6 +38,7 @@ class OrderBy:
class BuilderQuery:
signal: str
name: str = "A"
source: Optional[str] = None
limit: Optional[int] = None
filter_expression: Optional[str] = None
select_fields: Optional[List[TelemetryFieldKey]] = None
@@ -48,6 +49,8 @@ class BuilderQuery:
"signal": self.signal,
"name": self.name,
}
if self.source:
spec["source"] = self.source
if self.limit is not None:
spec["limit"] = self.limit
if self.filter_expression:
@@ -55,7 +58,9 @@ class BuilderQuery:
if self.select_fields:
spec["selectFields"] = [f.to_dict() for f in self.select_fields]
if self.order:
spec["order"] = [o.to_dict() for o in self.order]
spec["order"] = [
o.to_dict() if hasattr(o, "to_dict") else o for o in self.order
]
return {"type": "builder_query", "spec": spec}
@@ -76,7 +81,9 @@ class TraceOperatorQuery:
if self.limit is not None:
spec["limit"] = self.limit
if self.order:
spec["order"] = [o.to_dict() for o in self.order]
spec["order"] = [
o.to_dict() if hasattr(o, "to_dict") else o for o in self.order
]
return {"type": "builder_trace_operator", "spec": spec}
@@ -442,6 +449,7 @@ def build_scalar_query(
signal: str,
aggregations: List[Dict],
*,
source: Optional[str] = None,
group_by: Optional[List[Dict]] = None,
order: Optional[List[Dict]] = None,
limit: Optional[int] = None,
@@ -458,6 +466,9 @@ def build_scalar_query(
"aggregations": aggregations,
}
if source:
spec["source"] = source
if group_by:
spec["groupBy"] = group_by

View File

@@ -0,0 +1,441 @@
from datetime import datetime, timedelta, timezone
from http import HTTPStatus
from typing import Callable, List
import pytest
from fixtures import types
from fixtures.audit import AuditLog
from fixtures.auth import USER_ADMIN_EMAIL, USER_ADMIN_PASSWORD
from fixtures.querier import (
BuilderQuery,
build_logs_aggregation,
build_order_by,
build_scalar_query,
make_query_request,
)


def test_audit_list_all(
    signoz: types.SigNoz,
    create_user_admin: None,  # pylint: disable=unused-argument
    get_token: Callable[[str, str], str],
    insert_audit_logs: Callable[[List[AuditLog]], None],
) -> None:
    """List audit events across multiple resource types; verify count, ordering, and fields."""
    now = datetime.now(tz=timezone.utc)
    insert_audit_logs(
        [
            AuditLog(
                timestamp=now - timedelta(seconds=3),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "alert-rule",
                    "signoz.audit.resource.id": "alert-001",
                },
                attributes={
                    "signoz.audit.principal.id": "user-010",
                    "signoz.audit.principal.email": "ops@acme.com",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "create",
                    "signoz.audit.outcome": "success",
                },
                body="ops@acme.com (user-010) created alert-rule (alert-001)",
                event_name="alert-rule.created",
                severity_text="INFO",
            ),
            AuditLog(
                timestamp=now - timedelta(seconds=2),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "saved-view",
                    "signoz.audit.resource.id": "view-001",
                },
                attributes={
                    "signoz.audit.principal.id": "user-010",
                    "signoz.audit.principal.email": "ops@acme.com",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "update",
                    "signoz.audit.outcome": "success",
                },
                body="ops@acme.com (user-010) updated saved-view (view-001)",
                event_name="saved-view.updated",
                severity_text="INFO",
            ),
            AuditLog(
                timestamp=now - timedelta(seconds=1),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "user",
                    "signoz.audit.resource.id": "user-020",
                },
                attributes={
                    "signoz.audit.principal.id": "user-010",
                    "signoz.audit.principal.email": "ops@acme.com",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "update",
                    "signoz.audit.action_category": "access_control",
                    "signoz.audit.outcome": "success",
                },
                body="ops@acme.com (user-010) updated user (user-020)",
                event_name="user.role.changed",
                severity_text="INFO",
            ),
        ]
    )

    token = get_token(USER_ADMIN_EMAIL, USER_ADMIN_PASSWORD)
    now = datetime.now(tz=timezone.utc)
    response = make_query_request(
        signoz,
        token,
        start_ms=int((now - timedelta(seconds=30)).timestamp() * 1000),
        end_ms=int(now.timestamp() * 1000),
        queries=[
            BuilderQuery(
                signal="logs",
                source="audit",
                limit=100,
                order=[build_order_by("timestamp"), build_order_by("id")],
            ).to_dict()
        ],
        request_type="raw",
    )

    assert response.status_code == HTTPStatus.OK
    assert response.json()["status"] == "success"
    rows = response.json()["data"]["data"]["results"][0]["rows"]
    assert len(rows) == 3
    # Most recent first
    assert rows[0]["data"]["event_name"] == "user.role.changed"
    assert rows[1]["data"]["event_name"] == "saved-view.updated"
    assert rows[2]["data"]["event_name"] == "alert-rule.created"
    # Body and severity fields are mapped through as well
    assert rows[0]["data"]["body"] == "ops@acme.com (user-010) updated user (user-020)"
    assert rows[0]["data"]["severity_text"] == "INFO"


@pytest.mark.parametrize(
    "filter_expression,expected_count,expected_event_names",
    [
        pytest.param(
            "signoz.audit.principal.id = 'user-001'",
            3,
            {"session.login", "dashboard.updated", "dashboard.created"},
            id="filter_by_principal_id",
        ),
        pytest.param(
            "signoz.audit.outcome = 'failure'",
            1,
            {"dashboard.deleted"},
            id="filter_by_outcome_failure",
        ),
        pytest.param(
            "signoz.audit.resource.kind = 'dashboard'"
            " AND signoz.audit.resource.id = 'dash-001'",
            3,
            {"dashboard.deleted", "dashboard.updated", "dashboard.created"},
            id="filter_by_resource_kind_and_id",
        ),
        pytest.param(
            "signoz.audit.principal.type = 'service_account'",
            1,
            {"serviceaccount.apikey.created"},
            id="filter_by_principal_type",
        ),
        pytest.param(
            "signoz.audit.resource.kind = 'dashboard'"
            " AND signoz.audit.action = 'delete'",
            1,
            {"dashboard.deleted"},
            id="filter_by_resource_kind_and_action",
        ),
    ],
)
def test_audit_filter(
    signoz: types.SigNoz,
    create_user_admin: None,  # pylint: disable=unused-argument
    get_token: Callable[[str, str], str],
    insert_audit_logs: Callable[[List[AuditLog]], None],
    filter_expression: str,
    expected_count: int,
    expected_event_names: set,
) -> None:
    """Parametrized audit filter tests covering the documented query patterns."""
    now = datetime.now(tz=timezone.utc)
    insert_audit_logs(
        [
            AuditLog(
                timestamp=now - timedelta(seconds=5),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "dashboard",
                    "signoz.audit.resource.id": "dash-001",
                },
                attributes={
                    "signoz.audit.principal.id": "user-001",
                    "signoz.audit.principal.email": "alice@acme.com",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "create",
                    "signoz.audit.action_category": "configuration_change",
                    "signoz.audit.outcome": "success",
                },
                body="alice@acme.com created dashboard",
                event_name="dashboard.created",
            ),
            AuditLog(
                timestamp=now - timedelta(seconds=4),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "dashboard",
                    "signoz.audit.resource.id": "dash-001",
                },
                attributes={
                    "signoz.audit.principal.id": "user-001",
                    "signoz.audit.principal.email": "alice@acme.com",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "update",
                    "signoz.audit.action_category": "configuration_change",
                    "signoz.audit.outcome": "success",
                },
                body="alice@acme.com updated dashboard",
                event_name="dashboard.updated",
            ),
            AuditLog(
                timestamp=now - timedelta(seconds=3),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "dashboard",
                    "signoz.audit.resource.id": "dash-001",
                },
                attributes={
                    "signoz.audit.principal.id": "user-002",
                    "signoz.audit.principal.email": "viewer@acme.com",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "delete",
                    "signoz.audit.action_category": "configuration_change",
                    "signoz.audit.outcome": "failure",
                    "signoz.audit.error.type": "forbidden",
                    "signoz.audit.error.code": "authz_forbidden",
                },
                body="viewer@acme.com failed to delete dashboard",
                event_name="dashboard.deleted",
                severity_text="ERROR",
            ),
            AuditLog(
                timestamp=now - timedelta(seconds=2),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "serviceaccount",
                    "signoz.audit.resource.id": "sa-001",
                },
                attributes={
                    "signoz.audit.principal.id": "sa-001",
                    "signoz.audit.principal.email": "",
                    "signoz.audit.principal.type": "service_account",
                    "signoz.audit.action": "create",
                    "signoz.audit.action_category": "access_control",
                    "signoz.audit.outcome": "success",
                },
                body="sa-001 created serviceaccount",
                event_name="serviceaccount.apikey.created",
            ),
            AuditLog(
                timestamp=now - timedelta(seconds=1),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "session",
                    "signoz.audit.resource.id": "*",
                },
                attributes={
                    "signoz.audit.principal.id": "user-001",
                    "signoz.audit.principal.email": "alice@acme.com",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "login",
                    "signoz.audit.action_category": "access_control",
                    "signoz.audit.outcome": "success",
                },
                body="alice@acme.com login session",
                event_name="session.login",
            ),
        ]
    )

    token = get_token(USER_ADMIN_EMAIL, USER_ADMIN_PASSWORD)
    now = datetime.now(tz=timezone.utc)
    response = make_query_request(
        signoz,
        token,
        start_ms=int((now - timedelta(seconds=30)).timestamp() * 1000),
        end_ms=int(now.timestamp() * 1000),
        queries=[
            BuilderQuery(
                signal="logs",
                source="audit",
                limit=100,
                filter_expression=filter_expression,
                order=[build_order_by("timestamp"), build_order_by("id")],
            ).to_dict()
        ],
        request_type="raw",
    )

    assert response.status_code == HTTPStatus.OK
    rows = response.json()["data"]["data"]["results"][0]["rows"]
    assert len(rows) == expected_count
    actual_event_names = {row["data"]["event_name"] for row in rows}
    assert actual_event_names == expected_event_names
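The parametrize matrix extends naturally to the other documented fields; for example, against the same seeded data a category filter would match exactly the two access-control events. A hypothetical extra case, not part of this diff:

# Hypothetical additional pytest.param for the matrix above: against the
# seeded data, two events carry action_category = 'access_control'.
pytest.param(
    "signoz.audit.action_category = 'access_control'",
    2,
    {"session.login", "serviceaccount.apikey.created"},
    id="filter_by_action_category",
),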


def test_audit_scalar_count_failures(
    signoz: types.SigNoz,
    create_user_admin: None,  # pylint: disable=unused-argument
    get_token: Callable[[str, str], str],
    insert_audit_logs: Callable[[List[AuditLog]], None],
) -> None:
    """Alert query: count multiple failures from different principals."""
    now = datetime.now(tz=timezone.utc)
    insert_audit_logs(
        [
            AuditLog(
                timestamp=now - timedelta(seconds=3),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "dashboard",
                    "signoz.audit.resource.id": "dash-100",
                },
                attributes={
                    "signoz.audit.principal.id": "user-050",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "delete",
                    "signoz.audit.outcome": "failure",
                },
                body="user-050 failed to delete dashboard",
                event_name="dashboard.deleted",
                severity_text="ERROR",
            ),
            AuditLog(
                timestamp=now - timedelta(seconds=2),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "alert-rule",
                    "signoz.audit.resource.id": "alert-200",
                },
                attributes={
                    "signoz.audit.principal.id": "user-060",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "update",
                    "signoz.audit.outcome": "failure",
                },
                body="user-060 failed to update alert-rule",
                event_name="alert-rule.updated",
                severity_text="ERROR",
            ),
            AuditLog(
                timestamp=now - timedelta(seconds=1),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "dashboard",
                    "signoz.audit.resource.id": "dash-100",
                },
                attributes={
                    "signoz.audit.principal.id": "user-050",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "update",
                    "signoz.audit.outcome": "success",
                },
                body="user-050 updated dashboard",
                event_name="dashboard.updated",
            ),
        ]
    )

    token = get_token(USER_ADMIN_EMAIL, USER_ADMIN_PASSWORD)
    now = datetime.now(tz=timezone.utc)
    response = make_query_request(
        signoz,
        token,
        start_ms=int((now - timedelta(seconds=30)).timestamp() * 1000),
        end_ms=int(now.timestamp() * 1000),
        queries=[
            build_scalar_query(
                name="A",
                signal="logs",
                source="audit",
                aggregations=[build_logs_aggregation("count()")],
                filter_expression="signoz.audit.outcome = 'failure'",
            )
        ],
        request_type="scalar",
    )

    assert response.status_code == HTTPStatus.OK
    assert response.json()["status"] == "success"
    scalar_data = response.json()["data"]["data"]["results"][0].get("data", [])
    assert len(scalar_data) == 1
    # Only the two failure events count; the success row is filtered out.
    assert scalar_data[0][0] == 2
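A per-principal breakdown of the same alert query only needs a group-by. A hedged sketch; the dict shape passed to group_by below is an assumption mirroring the other builder fields, not something this diff confirms:

# Hedged sketch: count failures grouped by principal.
# The {"name": ...} key shape for group_by entries is hypothetical.
build_scalar_query(
    name="A",
    signal="logs",
    source="audit",
    aggregations=[build_logs_aggregation("count()")],
    filter_expression="signoz.audit.outcome = 'failure'",
    group_by=[{"name": "signoz.audit.principal.id"}],
)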


def test_audit_does_not_leak_into_logs(
    signoz: types.SigNoz,
    create_user_admin: None,  # pylint: disable=unused-argument
    get_token: Callable[[str, str], str],
    insert_audit_logs: Callable[[List[AuditLog]], None],
) -> None:
    """A single audit event in signoz_audit must not appear in regular log queries."""
    now = datetime.now(tz=timezone.utc)
    insert_audit_logs(
        [
            AuditLog(
                timestamp=now - timedelta(seconds=1),
                resources={
                    "service.name": "signoz",
                    "signoz.audit.resource.kind": "organization",
                    "signoz.audit.resource.id": "org-999",
                },
                attributes={
                    "signoz.audit.principal.id": "user-admin",
                    "signoz.audit.principal.type": "user",
                    "signoz.audit.action": "update",
                    "signoz.audit.outcome": "success",
                },
                body="user-admin updated organization (org-999)",
                event_name="organization.updated",
            ),
        ]
    )

    token = get_token(USER_ADMIN_EMAIL, USER_ADMIN_PASSWORD)
    now = datetime.now(tz=timezone.utc)
    response = make_query_request(
        signoz,
        token,
        start_ms=int((now - timedelta(seconds=30)).timestamp() * 1000),
        end_ms=int(now.timestamp() * 1000),
        queries=[
            BuilderQuery(
                signal="logs",
                limit=100,
                order=[build_order_by("timestamp"), build_order_by("id")],
            ).to_dict()
        ],
        request_type="raw",
    )

    assert response.status_code == HTTPStatus.OK
    rows = response.json()["data"]["data"]["results"][0].get("rows") or []
    # A row carrying the audit action attribute key would mean the audit
    # event leaked into the default logs source.
    audit_bodies = [
        row["data"]["body"]
        for row in rows
        if "signoz.audit.action" in row["data"].get("attributes_string", {})
    ]
    assert len(audit_bodies) == 0
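The complementary isolation check, that ordinary log lines never surface under source="audit", is a natural follow-up. A hedged sketch assuming a hypothetical insert_logs fixture for seeding regular logs (this diff only defines insert_audit_logs); token, start_ms, and end_ms are as in the tests above:

# Hedged sketch of the inverse test; insert_logs is hypothetical.
# insert_logs([...])  # seed one ordinary application log line
response = make_query_request(
    signoz,
    token,
    start_ms=start_ms,
    end_ms=end_ms,
    queries=[
        BuilderQuery(signal="logs", source="audit", limit=100).to_dict()
    ],
    request_type="raw",
)
rows = response.json()["data"]["data"]["results"][0].get("rows") or []
# Every returned row must be an audit event, never a regular log line.
assert all(
    "signoz.audit.action" in row["data"].get("attributes_string", {})
    for row in rows
)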