Compare commits

...

28 Commits

Author SHA1 Message Date
Piyush Singariya
7e7732243e fix: dynamic array tests 2026-04-01 12:54:26 +05:30
Piyush Singariya
2f952e402f feat: change filtering of dynamic arrays 2026-04-01 12:09:32 +05:30
Piyush Singariya
a12febca4a fix: array json element comparison 2026-04-01 10:34:55 +05:30
Piyush Singariya
cb71c9c3f7 Merge branch 'main' into fix/array-json 2026-03-31 15:42:09 +05:30
Piyush Singariya
1cd4ce6509 Merge branch 'main' into fix/array-json 2026-03-31 14:55:36 +05:30
Vikrant Gupta
2163e1ce41 chore(lint): enable godot and staticcheck (#10775)
* chore(lint): enable godot and staticcheck

* chore(lint): merge main and fix new lint issues in main
2026-03-31 09:11:49 +00:00
Srikanth Chekuri
b9eecacab7 chore: remove v1 metrics explorer code (#10764)
* chore: remove v1 metrics explorer code

* chore: fix ci

* chore: fix tests

* chore: address review comments
2026-03-31 08:46:27 +00:00
Tushar Vats
13982033dc fix: handle empty not() expression (#10165)
* fix: handle empty not() expression

* fix: handle more cases

* fix: short circuit conditions and updated unit tests

* fix: revert commented code

* fix: added more unit tests

* fix: added integration tests

* fix: make py-lint and make py-fmt

* fix: moved from traces to logs for testing full text search

* fix: simplify code

* fix: added more unit tests

* fix: addressed comments

* fix: update comment

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>

* fix: update unit test

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>

* fix: update unit test

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>

* fix: instead of using true, using a skip literal

* fix: unit test

* fix: update integration test

* fix: update unit for relevance

* fix: lint error

* fix: added a new literal for error condition, added more unit tests

* fix: merge issues

* fix: inline comments

* fix: update unit tests merging from main

* fix: make py-fmt and make py-lint

* fix: type handling

---------

Co-authored-by: Srikanth Chekuri <srikanth.chekuri92@gmail.com>
2026-03-31 08:44:07 +00:00
Piyush Singariya
b198cfc11a chore: enable JSON Path index in JSON Logs (#10736)
* feat: enable JSON Path index

* fix: contextual path index usage

* test: fix unit tests
2026-03-31 08:07:27 +00:00
Amaresh S M
7b6f77bd52 fix(devenv): fix otel-collector startup failure (#10620)
Co-authored-by: Nageshbansal <76246968+Nageshbansal@users.noreply.github.com>
2026-03-31 07:08:41 +00:00
Pandey
e588c57e44 feat(audit): add noop auditor for community edition (#10769)
* feat(audit): add noop auditor and embed ServiceWithHealthy in Auditor

Embed factory.ServiceWithHealthy in the Auditor interface so all
providers (noop and future OTLP HTTP) share a uniform lifecycle
contract. Add pkg/auditor/noopauditor for the community edition
that silently discards all events with zero allocations.

* feat(audit): remove noopauditor test file
2026-03-31 06:16:20 +00:00
Ayush Shukla
98f53423dc docs: fix typo 'versinoing' -> 'versioning' in frontend README (#10765) 2026-03-31 05:05:23 +00:00
Pandey
e41d400aa0 feat(audit): add Auditor interface and rename auditortypes to audittypes (#10766)
* feat(audit): add Auditor interface and rename auditortypes to audittypes

Add the Auditor interface in pkg/auditor/ as the contract between the
HTTP handler layer and audit implementations (noop for community, OTLP
for enterprise). Includes Config with buffer, batch, and flush settings
following the provider pattern.

Rename pkg/types/auditortypes/ to pkg/types/audittypes/ for consistency.

closes SigNoz/platform-pod#1930

* refactor(audit): move endpoint config to OTLPHTTPConfig provider struct

Move Endpoint out of top-level Config into a provider-specific
OTLPHTTPConfig struct with full OTLP HTTP options (url_path, insecure,
compression, timeout, headers, retry). Keep BufferSize, BatchSize,
FlushInterval at top level as common settings across providers.

closes SigNoz/platform-pod#1930

* feat(audit): add auditorbatcher for buffered event batching

Add pkg/auditor/auditorbatcher/ with channel-based batching for audit
events. Flushes when either batch size is reached or flush interval
elapses (whichever comes first). Events are dropped when the buffer is
full (fail-open). Follows the alertmanagerbatcher pattern.

* refactor(audit): replace public channel with Receive method on batcher

Make batchC private and expose Receive() <-chan []AuditEvent as the
read-side API. Clearer contract: Add() to write, Receive() to read.

* refactor(audit): rename batcher config fields to BufferSize and BatchSize

Capacity → BufferSize, Size → BatchSize for clarity.

* fix(audit): single-line slog call and fix log key to buffer_size

* feat(audit): add OTel metrics to auditorbatcher

Add telemetry via OTel MeterProvider with 4 instruments:
- signoz.audit.events.emitted (counter)
- signoz.audit.store.write_errors (counter, via RecordWriteError)
- signoz.audit.events.dropped (counter)
- signoz.audit.events.buffer_size (observable gauge)

Batcher.New() now accepts metric.Meter and returns error.

* refactor(audit): inject factory.ScopedProviderSettings into batcher

Replace separate logger and meter params with ScopedProviderSettings,
giving the batcher access to logger, meter, and tracer from one source.

* feat(audit): add OTel tracing to batcher Add path

Span auditorbatcher.Add with event_name attribute set at creation
and audit.dropped set dynamically on buffer-full drop.

* feat(audit): add export span to batcher Receive path

Introduce Batch struct carrying context, events, and a trace span.
Each flush starts an auditorbatcher.Export span with batch_size
attribute. The consumer ends the span after export completes.

* refactor(audit): replace Batch/Receive with ExportFunc callback

Batcher now takes an ExportFunc at construction and manages spans
internally. Removes Batch struct, Receive(), and RecordWriteError()
from the public API. Span.End() is always called via defer, write
errors and span status are recorded automatically on export failure.
Uses errors.Attr for error logging, prefixes log keys with audit.

* refactor(audit): rename auditorbatcher to auditorserver

Rename package, file (batcher.go → server.go), type (Batcher → Server),
and receiver (b → server) to reflect the full service role: buffering,
batching, metrics, tracing, and export lifecycle management.

* refactor(audit): rename telemetry to serverMetrics and document struct fields

Rename type telemetry → serverMetrics and constructor
newTelemetry → newServerMetrics. Add comments to all Server
struct fields.

* feat(audit): implement ServiceWithHealthy, fix race, add unit tests

- Implement factory.ServiceWithHealthy on Server via healthyC channel
- Fix data race: Start closes healthyC after goroutinesWg.Add(1),
  Stop waits on healthyC before closing stopC
- Add 8 unit tests covering construction, start/stop lifecycle,
  batch-size flush, interval flush, buffer-full drop, drain on stop,
  export failure handling, and concurrent safety

* fix(audit): fix lint issues in auditorserver and tests

Use snake_case for slog keys, errors.New instead of fmt.Errorf,
and check all Start/Stop return values in tests.

* fix(audit): address PR review comments

Use auditor:: prefix in validation error messages. Move fire-and-forget
comment to the Audit method, remove interface-level comment.
2026-03-30 20:46:38 +00:00
Vikrant Gupta
d19592ce7b chore(authz): bump up openfga version (#10767)
* chore(authz): bump up openfga version

* chore(authz): fix tests

* chore(authz): bump up openfga version

* chore(authz): remove ee references
2026-03-30 20:44:11 +00:00
Phuc
1f43feaf3c refactor(time-range): replace hardcoded 30m defaults with const (#10748)
* refactor: replace hardcoded 30m defaults with DEFAULT_TIME_RANGE

* refactor: use DEFAULT_TIME_RANGE in home.tsx

* revert: restore mock data before DEFAULT_TIME_RANGE changes
2026-03-30 15:49:01 +00:00
Piyush Singariya
9299c8ab18 fix: indexed unit tests 2026-03-30 15:47:47 +05:30
Piyush Singariya
24749de269 fix: comment 2026-03-30 15:16:28 +05:30
Piyush Singariya
39098ec3f4 fix: unit tests 2026-03-30 15:12:17 +05:30
Piyush Singariya
fe554f5c94 fix: remove not used paths from testdata 2026-03-30 14:24:48 +05:30
Piyush Singariya
8a60a041a6 fix: unit tests 2026-03-30 14:14:49 +05:30
Piyush Singariya
541f19c34a fix: array type filtering from dynamic arrays 2026-03-30 12:59:31 +05:30
Piyush Singariya
010db03d6e fix: indexed tests passing 2026-03-30 12:24:26 +05:30
Piyush Singariya
5408acbd8c fix: primitive conditions working 2026-03-30 12:01:35 +05:30
Piyush Singariya
0de6c85f81 feat: align negative operators to include other logs 2026-03-28 10:30:11 +05:30
Piyush Singariya
69ec24fa05 test: fix unit tests 2026-03-27 15:12:49 +05:30
Piyush Singariya
539d732b65 fix: contextual path index usage 2026-03-27 14:44:51 +05:30
Piyush Singariya
843d5fb199 Merge branch 'main' into feat/json-index 2026-03-27 14:17:52 +05:30
Piyush Singariya
fabdfb8cc1 feat: enable JSON Path index 2026-03-27 14:07:37 +05:30
288 changed files with 15731 additions and 5710 deletions

View File

@@ -27,8 +27,8 @@ services:
- ${PWD}/fs/tmp/var/lib/clickhouse/user_scripts/:/var/lib/clickhouse/user_scripts/
- ${PWD}/../../../deploy/common/clickhouse/custom-function.xml:/etc/clickhouse-server/custom-function.xml
ports:
- '127.0.0.1:8123:8123'
- '127.0.0.1:9000:9000'
- "127.0.0.1:8123:8123"
- "127.0.0.1:9000:9000"
tty: true
healthcheck:
test:
@@ -47,13 +47,16 @@ services:
condition: service_healthy
environment:
- CLICKHOUSE_SKIP_USER_SETUP=1
networks:
- default
- signoz-devenv
zookeeper:
image: signoz/zookeeper:3.7.1
container_name: zookeeper
volumes:
- ${PWD}/fs/tmp/zookeeper:/bitnami/zookeeper
ports:
- '127.0.0.1:2181:2181'
- "127.0.0.1:2181:2181"
environment:
- ALLOW_ANONYMOUS_LOGIN=yes
healthcheck:
@@ -74,12 +77,19 @@ services:
entrypoint:
- /bin/sh
command:
- -c
- |
/signoz-otel-collector migrate bootstrap &&
/signoz-otel-collector migrate sync up &&
/signoz-otel-collector migrate async up
- -c
- |
/signoz-otel-collector migrate bootstrap &&
/signoz-otel-collector migrate sync up &&
/signoz-otel-collector migrate async up
depends_on:
clickhouse:
condition: service_healthy
restart: on-failure
networks:
- default
- signoz-devenv
networks:
signoz-devenv:
name: signoz-devenv

View File

@@ -3,7 +3,7 @@ services:
image: signoz/signoz-otel-collector:v0.142.0
container_name: signoz-otel-collector-dev
entrypoint:
- /bin/sh
- /bin/sh
command:
- -c
- |
@@ -34,4 +34,11 @@ services:
retries: 3
restart: unless-stopped
extra_hosts:
- "host.docker.internal:host-gateway"
- "host.docker.internal:host-gateway"
networks:
- default
- signoz-devenv
networks:
signoz-devenv:
name: signoz-devenv

View File

@@ -12,10 +12,10 @@ receivers:
scrape_configs:
- job_name: otel-collector
static_configs:
- targets:
- localhost:8888
labels:
job_name: otel-collector
- targets:
- localhost:8888
labels:
job_name: otel-collector
processors:
batch:
@@ -29,7 +29,26 @@ processors:
signozspanmetrics/delta:
metrics_exporter: signozclickhousemetrics
metrics_flush_interval: 60s
latency_histogram_buckets: [100us, 1ms, 2ms, 6ms, 10ms, 50ms, 100ms, 250ms, 500ms, 1000ms, 1400ms, 2000ms, 5s, 10s, 20s, 40s, 60s ]
latency_histogram_buckets:
[
100us,
1ms,
2ms,
6ms,
10ms,
50ms,
100ms,
250ms,
500ms,
1000ms,
1400ms,
2000ms,
5s,
10s,
20s,
40s,
60s,
]
dimensions_cache_size: 100000
aggregation_temporality: AGGREGATION_TEMPORALITY_DELTA
enable_exp_histogram: true
@@ -60,13 +79,13 @@ extensions:
exporters:
clickhousetraces:
datasource: tcp://host.docker.internal:9000/signoz_traces
datasource: tcp://clickhouse:9000/signoz_traces
low_cardinal_exception_grouping: ${env:LOW_CARDINAL_EXCEPTION_GROUPING}
use_new_schema: true
signozclickhousemetrics:
dsn: tcp://host.docker.internal:9000/signoz_metrics
dsn: tcp://clickhouse:9000/signoz_metrics
clickhouselogsexporter:
dsn: tcp://host.docker.internal:9000/signoz_logs
dsn: tcp://clickhouse:9000/signoz_logs
timeout: 10s
use_new_schema: true
@@ -93,4 +112,4 @@ service:
logs:
receivers: [otlp]
processors: [batch]
exporters: [clickhouselogsexporter]
exporters: [clickhouselogsexporter]

View File

@@ -6,12 +6,14 @@ linters:
- depguard
- errcheck
- forbidigo
- godot
- govet
- iface
- ineffassign
- misspell
- nilnil
- sloglint
- staticcheck
- wastedassign
- unparam
- unused

View File

@@ -12,6 +12,7 @@ import (
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/authz/openfgaauthz"
"github.com/SigNoz/signoz/pkg/authz/openfgaschema"
"github.com/SigNoz/signoz/pkg/authz/openfgaserver"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/gateway"
@@ -78,8 +79,13 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) e
func(ctx context.Context, providerSettings factory.ProviderSettings, store authtypes.AuthNStore, licensing licensing.Licensing) (map[authtypes.AuthNProvider]authn.AuthN, error) {
return signoz.NewAuthNs(ctx, providerSettings, store, licensing)
},
func(ctx context.Context, sqlstore sqlstore.SQLStore, _ licensing.Licensing, _ dashboard.Module) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return openfgaauthz.NewProviderFactory(sqlstore, openfgaschema.NewSchema().Get(ctx))
func(ctx context.Context, sqlstore sqlstore.SQLStore, _ licensing.Licensing, _ dashboard.Module) (factory.ProviderFactory[authz.AuthZ, authz.Config], error) {
openfgaDataStore, err := openfgaserver.NewSQLStore(sqlstore)
if err != nil {
return nil, err
}
return openfgaauthz.NewProviderFactory(sqlstore, openfgaschema.NewSchema().Get(ctx), openfgaDataStore), nil
},
func(store sqlstore.SQLStore, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, queryParser queryparser.QueryParser, _ querier.Querier, _ licensing.Licensing) dashboard.Module {
return impldashboard.NewModule(impldashboard.NewStore(store), settings, analytics, orgGetter, queryParser)

View File

@@ -12,6 +12,7 @@ import (
"github.com/SigNoz/signoz/ee/authn/callbackauthn/samlcallbackauthn"
"github.com/SigNoz/signoz/ee/authz/openfgaauthz"
"github.com/SigNoz/signoz/ee/authz/openfgaschema"
"github.com/SigNoz/signoz/ee/authz/openfgaserver"
"github.com/SigNoz/signoz/ee/gateway/httpgateway"
enterpriselicensing "github.com/SigNoz/signoz/ee/licensing"
"github.com/SigNoz/signoz/ee/licensing/httplicensing"
@@ -118,8 +119,13 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) e
return authNs, nil
},
func(ctx context.Context, sqlstore sqlstore.SQLStore, licensing licensing.Licensing, dashboardModule dashboard.Module) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return openfgaauthz.NewProviderFactory(sqlstore, openfgaschema.NewSchema().Get(ctx), licensing, dashboardModule)
func(ctx context.Context, sqlstore sqlstore.SQLStore, licensing licensing.Licensing, dashboardModule dashboard.Module) (factory.ProviderFactory[authz.AuthZ, authz.Config], error) {
openfgaDataStore, err := openfgaserver.NewSQLStore(sqlstore)
if err != nil {
return nil, err
}
return openfgaauthz.NewProviderFactory(sqlstore, openfgaschema.NewSchema().Get(ctx), openfgaDataStore, licensing, dashboardModule), nil
},
func(store sqlstore.SQLStore, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, queryParser queryparser.QueryParser, querier querier.Querier, licensing licensing.Licensing) dashboard.Module {
return impldashboard.NewModule(pkgimpldashboard.NewStore(store), settings, analytics, orgGetter, queryParser, querier, licensing)

View File

@@ -1114,6 +1114,33 @@ components:
enabled:
type: boolean
type: object
MetricsexplorertypesInspectMetricsRequest:
properties:
end:
format: int64
type: integer
filter:
$ref: '#/components/schemas/Querybuildertypesv5Filter'
metricName:
type: string
start:
format: int64
type: integer
required:
- metricName
- start
- end
type: object
MetricsexplorertypesInspectMetricsResponse:
properties:
series:
items:
$ref: '#/components/schemas/Querybuildertypesv5TimeSeries'
nullable: true
type: array
required:
- series
type: object
MetricsexplorertypesListMetric:
properties:
description:
@@ -1262,6 +1289,13 @@ components:
- temporality
- isMonotonic
type: object
MetricsexplorertypesMetricsOnboardingResponse:
properties:
hasMetrics:
type: boolean
required:
- hasMetrics
type: object
MetricsexplorertypesStat:
properties:
description:
@@ -7750,6 +7784,111 @@ paths:
summary: Update metric metadata
tags:
- metrics
/api/v2/metrics/inspect:
post:
deprecated: false
description: Returns raw time series data points for a metric within a time
range (max 30 minutes). Each series includes labels and timestamp/value pairs.
operationId: InspectMetrics
requestBody:
content:
application/json:
schema:
$ref: '#/components/schemas/MetricsexplorertypesInspectMetricsRequest'
responses:
"200":
content:
application/json:
schema:
properties:
data:
$ref: '#/components/schemas/MetricsexplorertypesInspectMetricsResponse'
status:
type: string
required:
- status
- data
type: object
description: OK
"400":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Bad Request
"401":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Unauthorized
"403":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Forbidden
"500":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Internal Server Error
security:
- api_key:
- VIEWER
- tokenizer:
- VIEWER
summary: Inspect raw metric data points
tags:
- metrics
/api/v2/metrics/onboarding:
get:
deprecated: false
description: Lightweight endpoint that checks if any non-SigNoz metrics have
been ingested, used for onboarding status detection
operationId: GetMetricsOnboardingStatus
responses:
"200":
content:
application/json:
schema:
properties:
data:
$ref: '#/components/schemas/MetricsexplorertypesMetricsOnboardingResponse'
status:
type: string
required:
- status
- data
type: object
description: OK
"401":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Unauthorized
"403":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Forbidden
"500":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Internal Server Error
security:
- api_key:
- VIEWER
- tokenizer:
- VIEWER
summary: Check if non-SigNoz metrics have been received
tags:
- metrics
/api/v2/metrics/stats:
post:
deprecated: false

View File

@@ -16,7 +16,7 @@ func (hp *HourlyProvider) GetBaseSeasonalProvider() *BaseSeasonalProvider {
return &hp.BaseSeasonalProvider
}
// NewHourlyProvider now uses the generic option type
// NewHourlyProvider now uses the generic option type.
func NewHourlyProvider(opts ...GenericProviderOption[*HourlyProvider]) *HourlyProvider {
hp := &HourlyProvider{
BaseSeasonalProvider: BaseSeasonalProvider{},

View File

@@ -47,7 +47,7 @@ type AnomaliesResponse struct {
// | |
// (rounded value for past peiod) + (seasonal growth)
//
// score = abs(value - prediction) / stddev (current_season_query)
// score = abs(value - prediction) / stddev (current_season_query).
type anomalyQueryParams struct {
// CurrentPeriodQuery is the query range params for period user is looking at or eval window
// Example: (now-5m, now), (now-30m, now), (now-1h, now)

View File

@@ -18,12 +18,12 @@ var (
movingAvgWindowSize = 7
)
// BaseProvider is an interface that includes common methods for all provider types
// BaseProvider is an interface that includes common methods for all provider types.
type BaseProvider interface {
GetBaseSeasonalProvider() *BaseSeasonalProvider
}
// GenericProviderOption is a generic type for provider options
// GenericProviderOption is a generic type for provider options.
type GenericProviderOption[T BaseProvider] func(T)
func WithQuerier[T BaseProvider](querier querier.Querier) GenericProviderOption[T] {
@@ -121,7 +121,7 @@ func (p *BaseSeasonalProvider) getResults(ctx context.Context, orgID valuer.UUID
}
// getMatchingSeries gets the matching series from the query result
// for the given series
// for the given series.
func (p *BaseSeasonalProvider) getMatchingSeries(_ context.Context, queryResult *qbtypes.TimeSeriesData, series *qbtypes.TimeSeries) *qbtypes.TimeSeries {
if queryResult == nil || len(queryResult.Aggregations) == 0 || len(queryResult.Aggregations[0].Series) == 0 {
return nil
@@ -155,13 +155,14 @@ func (p *BaseSeasonalProvider) getStdDev(series *qbtypes.TimeSeries) float64 {
avg := p.getAvg(series)
var sum float64
for _, smpl := range series.Values {
sum += math.Pow(smpl.Value-avg, 2)
d := smpl.Value - avg
sum += d * d
}
return math.Sqrt(sum / float64(len(series.Values)))
}
// getMovingAvg gets the moving average for the given series
// for the given window size and start index
// for the given window size and start index.
func (p *BaseSeasonalProvider) getMovingAvg(series *qbtypes.TimeSeries, movingAvgWindowSize, startIdx int) float64 {
if series == nil || len(series.Values) == 0 {
return 0
@@ -236,7 +237,7 @@ func (p *BaseSeasonalProvider) getPredictedSeries(
// getBounds gets the upper and lower bounds for the given series
// for the given z score threshold
// moving avg of the previous period series + z score threshold * std dev of the series
// moving avg of the previous period series - z score threshold * std dev of the series
// moving avg of the previous period series - z score threshold * std dev of the series.
func (p *BaseSeasonalProvider) getBounds(
series, predictedSeries, weekSeries *qbtypes.TimeSeries,
zScoreThreshold float64,
@@ -269,7 +270,7 @@ func (p *BaseSeasonalProvider) getBounds(
// getExpectedValue gets the expected value for the given series
// for the given index
// prevSeriesAvg + currentSeasonSeriesAvg - mean of past season series, past2 season series and past3 season series
// prevSeriesAvg + currentSeasonSeriesAvg - mean of past season series, past2 season series and past3 season series.
func (p *BaseSeasonalProvider) getExpectedValue(
_, prevSeries, currentSeasonSeries, pastSeasonSeries, past2SeasonSeries, past3SeasonSeries *qbtypes.TimeSeries, idx int,
) float64 {
@@ -283,7 +284,7 @@ func (p *BaseSeasonalProvider) getExpectedValue(
// getScore gets the anomaly score for the given series
// for the given index
// (value - expectedValue) / std dev of the series
// (value - expectedValue) / std dev of the series.
func (p *BaseSeasonalProvider) getScore(
series, prevSeries, weekSeries, weekPrevSeries, past2SeasonSeries, past3SeasonSeries *qbtypes.TimeSeries, value float64, idx int,
) float64 {
@@ -296,7 +297,7 @@ func (p *BaseSeasonalProvider) getScore(
// getAnomalyScores gets the anomaly scores for the given series
// for the given index
// (value - expectedValue) / std dev of the series
// (value - expectedValue) / std dev of the series.
func (p *BaseSeasonalProvider) getAnomalyScores(
series, prevSeries, currentSeasonSeries, pastSeasonSeries, past2SeasonSeries, past3SeasonSeries *qbtypes.TimeSeries,
) *qbtypes.TimeSeries {

View File

@@ -16,6 +16,7 @@ import (
"github.com/SigNoz/signoz/pkg/valuer"
openfgav1 "github.com/openfga/api/proto/openfga/v1"
openfgapkgtransformer "github.com/openfga/language/pkg/go/transformer"
"github.com/openfga/openfga/pkg/storage"
)
type provider struct {
@@ -26,14 +27,14 @@ type provider struct {
registry []authz.RegisterTypeable
}
func NewProviderFactory(sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile, licensing licensing.Licensing, registry ...authz.RegisterTypeable) factory.ProviderFactory[authz.AuthZ, authz.Config] {
func NewProviderFactory(sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile, openfgaDataStore storage.OpenFGADatastore, licensing licensing.Licensing, registry ...authz.RegisterTypeable) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return factory.NewProviderFactory(factory.MustNewName("openfga"), func(ctx context.Context, ps factory.ProviderSettings, config authz.Config) (authz.AuthZ, error) {
return newOpenfgaProvider(ctx, ps, config, sqlstore, openfgaSchema, licensing, registry)
return newOpenfgaProvider(ctx, ps, config, sqlstore, openfgaSchema, openfgaDataStore, licensing, registry)
})
}
func newOpenfgaProvider(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile, licensing licensing.Licensing, registry []authz.RegisterTypeable) (authz.AuthZ, error) {
pkgOpenfgaAuthzProvider := pkgopenfgaauthz.NewProviderFactory(sqlstore, openfgaSchema)
func newOpenfgaProvider(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile, openfgaDataStore storage.OpenFGADatastore, licensing licensing.Licensing, registry []authz.RegisterTypeable) (authz.AuthZ, error) {
pkgOpenfgaAuthzProvider := pkgopenfgaauthz.NewProviderFactory(sqlstore, openfgaSchema, openfgaDataStore)
pkgAuthzService, err := pkgOpenfgaAuthzProvider.New(ctx, settings, config)
if err != nil {
return nil, err

View File

@@ -0,0 +1,32 @@
package openfgaserver
import (
"github.com/SigNoz/signoz/ee/sqlstore/postgressqlstore"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/openfga/openfga/pkg/storage"
"github.com/openfga/openfga/pkg/storage/postgres"
"github.com/openfga/openfga/pkg/storage/sqlcommon"
"github.com/openfga/openfga/pkg/storage/sqlite"
)
func NewSQLStore(store sqlstore.SQLStore) (storage.OpenFGADatastore, error) {
switch store.BunDB().Dialect().Name().String() {
case "sqlite":
return sqlite.NewWithDB(store.SQLDB(), &sqlcommon.Config{
MaxTuplesPerWriteField: 100,
MaxTypesPerModelField: 100,
})
case "pg":
pgStore, ok := store.(postgressqlstore.Pooler)
if !ok {
panic(errors.New(errors.TypeInternal, errors.CodeInternal, "postgressqlstore should implement Pooler"))
}
return postgres.NewWithDB(pgStore.Pool(), nil, &sqlcommon.Config{
MaxTuplesPerWriteField: 100,
MaxTypesPerModelField: 100,
})
}
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "invalid store type: %s", store.BunDB().Dialect().Name().String())
}

View File

@@ -13,7 +13,7 @@ var (
once sync.Once
)
// initializes the licensing configuration
// Config initializes the licensing configuration.
func Config(pollInterval time.Duration, failureThreshold int) licensing.Config {
once.Do(func() {
config = licensing.Config{PollInterval: pollInterval, FailureThreshold: failureThreshold}

View File

@@ -79,7 +79,7 @@ func (h *handler) QueryRange(rw http.ResponseWriter, req *http.Request) {
// Build step intervals from the anomaly query
stepIntervals := make(map[string]uint64)
if anomalyQuery.StepInterval.Duration > 0 {
stepIntervals[anomalyQuery.Name] = uint64(anomalyQuery.StepInterval.Duration.Seconds())
stepIntervals[anomalyQuery.Name] = uint64(anomalyQuery.StepInterval.Seconds())
}
finalResp := &qbtypes.QueryRangeResponse{

View File

@@ -242,7 +242,6 @@ func (s *Server) createPublicServer(apiHandler *api.APIHandler, web web.Web) (*h
apiHandler.RegisterWebSocketPaths(r, am)
apiHandler.RegisterMessagingQueuesRoutes(r, am)
apiHandler.RegisterThirdPartyApiRoutes(r, am)
apiHandler.MetricExplorerRoutes(r, am)
apiHandler.RegisterTraceFunnelsRoutes(r, am)
err := s.signoz.APIServer.AddToRouter(r)

View File

@@ -336,9 +336,10 @@ func (dialect *dialect) UpdatePrimaryKey(ctx context.Context, bun bun.IDB, oldMo
}
fkReference := ""
if reference == Org {
switch reference {
case Org:
fkReference = OrgReference
} else if reference == User {
case User:
fkReference = UserReference
}
@@ -392,9 +393,10 @@ func (dialect *dialect) AddPrimaryKey(ctx context.Context, bun bun.IDB, oldModel
}
fkReference := ""
if reference == Org {
switch reference {
case Org:
fkReference = OrgReference
} else if reference == User {
case User:
fkReference = UserReference
}

View File

@@ -14,14 +14,21 @@ import (
"github.com/uptrace/bun/dialect/pgdialect"
)
var _ Pooler = new(provider)
type provider struct {
settings factory.ScopedProviderSettings
sqldb *sql.DB
bundb *sqlstore.BunDB
pgxPool *pgxpool.Pool
dialect *dialect
formatter sqlstore.SQLFormatter
}
type Pooler interface {
Pool() *pgxpool.Pool
}
func NewFactory(hookFactories ...factory.ProviderFactory[sqlstore.SQLStoreHook, sqlstore.Config]) factory.ProviderFactory[sqlstore.SQLStore, sqlstore.Config] {
return factory.NewProviderFactory(factory.MustNewName("postgres"), func(ctx context.Context, providerSettings factory.ProviderSettings, config sqlstore.Config) (sqlstore.SQLStore, error) {
hooks := make([]sqlstore.SQLStoreHook, len(hookFactories))
@@ -62,6 +69,7 @@ func New(ctx context.Context, providerSettings factory.ProviderSettings, config
settings: settings,
sqldb: sqldb,
bundb: bunDB,
pgxPool: pool,
dialect: new(dialect),
formatter: newFormatter(bunDB.Dialect()),
}, nil
@@ -75,6 +83,10 @@ func (provider *provider) SQLDB() *sql.DB {
return provider.sqldb
}
func (provider *provider) Pool() *pgxpool.Pool {
return provider.pgxPool
}
func (provider *provider) Dialect() sqlstore.SQLDialect {
return provider.dialect
}

View File

@@ -19,7 +19,7 @@ var (
once sync.Once
)
// initializes the Zeus configuration
// initializes the Zeus configuration.
func Config() zeus.Config {
once.Do(func() {
parsedURL, err := neturl.Parse(url)

View File

@@ -189,7 +189,7 @@ func (provider *Provider) do(ctx context.Context, url *url.URL, method string, k
return nil, provider.errFromStatusCode(response.StatusCode, errorMessage)
}
// This can be taken down to the client package
// This can be taken down to the client package.
func (provider *Provider) errFromStatusCode(statusCode int, errorMessage string) error {
switch statusCode {
case http.StatusBadRequest:

View File

@@ -12,7 +12,7 @@
or
`docker build . -t tagname`
**Tag to remote url- Introduce versinoing later on**
**Tag to remote url- Introduce versioning later on**
```
docker tag signoz/frontend:latest 7296823551/signoz:latest

View File

@@ -1,6 +1,7 @@
import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { DEFAULT_TIME_RANGE } from 'container/TopNav/DateTimeSelectionV2/constants';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
import { CreatePublicDashboardProps } from 'types/api/dashboard/public/create';
@@ -8,7 +9,7 @@ const createPublicDashboard = async (
props: CreatePublicDashboardProps,
): Promise<SuccessResponseV2<CreatePublicDashboardProps>> => {
const { dashboardId, timeRangeEnabled = false, defaultTimeRange = '30m' } = props;
const { dashboardId, timeRangeEnabled = false, defaultTimeRange = DEFAULT_TIME_RANGE } = props;
try {
const response = await axios.post(

View File

@@ -1,6 +1,7 @@
import axios from 'api';
import { ErrorResponseHandlerV2 } from 'api/ErrorResponseHandlerV2';
import { AxiosError } from 'axios';
import { DEFAULT_TIME_RANGE } from 'container/TopNav/DateTimeSelectionV2/constants';
import { ErrorV2Resp, SuccessResponseV2 } from 'types/api';
import { UpdatePublicDashboardProps } from 'types/api/dashboard/public/update';
@@ -8,7 +9,7 @@ const updatePublicDashboard = async (
props: UpdatePublicDashboardProps,
): Promise<SuccessResponseV2<UpdatePublicDashboardProps>> => {
const { dashboardId, timeRangeEnabled = false, defaultTimeRange = '30m' } = props;
const { dashboardId, timeRangeEnabled = false, defaultTimeRange = DEFAULT_TIME_RANGE } = props;
try {
const response = await axios.put(

View File

@@ -31,10 +31,13 @@ import type {
GetMetricHighlightsPathParameters,
GetMetricMetadata200,
GetMetricMetadataPathParameters,
GetMetricsOnboardingStatus200,
GetMetricsStats200,
GetMetricsTreemap200,
InspectMetrics200,
ListMetrics200,
ListMetricsParams,
MetricsexplorertypesInspectMetricsRequestDTO,
MetricsexplorertypesStatsRequestDTO,
MetricsexplorertypesTreemapRequestDTO,
MetricsexplorertypesUpdateMetricMetadataRequestDTO,
@@ -778,6 +781,176 @@ export const useUpdateMetricMetadata = <
return useMutation(mutationOptions);
};
/**
* Returns raw time series data points for a metric within a time range (max 30 minutes). Each series includes labels and timestamp/value pairs.
* @summary Inspect raw metric data points
*/
export const inspectMetrics = (
metricsexplorertypesInspectMetricsRequestDTO: BodyType<MetricsexplorertypesInspectMetricsRequestDTO>,
signal?: AbortSignal,
) => {
return GeneratedAPIInstance<InspectMetrics200>({
url: `/api/v2/metrics/inspect`,
method: 'POST',
headers: { 'Content-Type': 'application/json' },
data: metricsexplorertypesInspectMetricsRequestDTO,
signal,
});
};
export const getInspectMetricsMutationOptions = <
TError = ErrorType<RenderErrorResponseDTO>,
TContext = unknown
>(options?: {
mutation?: UseMutationOptions<
Awaited<ReturnType<typeof inspectMetrics>>,
TError,
{ data: BodyType<MetricsexplorertypesInspectMetricsRequestDTO> },
TContext
>;
}): UseMutationOptions<
Awaited<ReturnType<typeof inspectMetrics>>,
TError,
{ data: BodyType<MetricsexplorertypesInspectMetricsRequestDTO> },
TContext
> => {
const mutationKey = ['inspectMetrics'];
const { mutation: mutationOptions } = options
? options.mutation &&
'mutationKey' in options.mutation &&
options.mutation.mutationKey
? options
: { ...options, mutation: { ...options.mutation, mutationKey } }
: { mutation: { mutationKey } };
const mutationFn: MutationFunction<
Awaited<ReturnType<typeof inspectMetrics>>,
{ data: BodyType<MetricsexplorertypesInspectMetricsRequestDTO> }
> = (props) => {
const { data } = props ?? {};
return inspectMetrics(data);
};
return { mutationFn, ...mutationOptions };
};
export type InspectMetricsMutationResult = NonNullable<
Awaited<ReturnType<typeof inspectMetrics>>
>;
export type InspectMetricsMutationBody = BodyType<MetricsexplorertypesInspectMetricsRequestDTO>;
export type InspectMetricsMutationError = ErrorType<RenderErrorResponseDTO>;
/**
* @summary Inspect raw metric data points
*/
export const useInspectMetrics = <
TError = ErrorType<RenderErrorResponseDTO>,
TContext = unknown
>(options?: {
mutation?: UseMutationOptions<
Awaited<ReturnType<typeof inspectMetrics>>,
TError,
{ data: BodyType<MetricsexplorertypesInspectMetricsRequestDTO> },
TContext
>;
}): UseMutationResult<
Awaited<ReturnType<typeof inspectMetrics>>,
TError,
{ data: BodyType<MetricsexplorertypesInspectMetricsRequestDTO> },
TContext
> => {
const mutationOptions = getInspectMetricsMutationOptions(options);
return useMutation(mutationOptions);
};
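The generated `getInspectMetricsMutationOptions` above uses a nested ternary to decide whether to inject a default `mutationKey`. A standalone sketch of that merging rule (the names here are illustrative, not part of the generated client): a caller-supplied `mutationKey` wins; otherwise the generated default key is spliced into the options.

```typescript
// Sketch of the option-merging logic in getInspectMetricsMutationOptions:
// a caller-supplied mutationKey takes precedence; otherwise the default
// key is injected without discarding the caller's other mutation options.
type MutationOpts = { mutationKey?: unknown[] } & Record<string, unknown>;

function mergeMutationKey(
  defaultKey: unknown[],
  options?: { mutation?: MutationOpts },
): { mutation: MutationOpts } {
  if (options?.mutation && 'mutationKey' in options.mutation && options.mutation.mutationKey) {
    // Caller already chose a key: pass their options through untouched.
    return options as { mutation: MutationOpts };
  }
  // No caller key: keep any other options and add the default key.
  return { ...options, mutation: { ...options?.mutation, mutationKey: defaultKey } };
}
```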
/**
* Lightweight endpoint that checks if any non-SigNoz metrics have been ingested, used for onboarding status detection
* @summary Check if non-SigNoz metrics have been received
*/
export const getMetricsOnboardingStatus = (signal?: AbortSignal) => {
return GeneratedAPIInstance<GetMetricsOnboardingStatus200>({
url: `/api/v2/metrics/onboarding`,
method: 'GET',
signal,
});
};
export const getGetMetricsOnboardingStatusQueryKey = () => {
return [`/api/v2/metrics/onboarding`] as const;
};
export const getGetMetricsOnboardingStatusQueryOptions = <
TData = Awaited<ReturnType<typeof getMetricsOnboardingStatus>>,
TError = ErrorType<RenderErrorResponseDTO>
>(options?: {
query?: UseQueryOptions<
Awaited<ReturnType<typeof getMetricsOnboardingStatus>>,
TError,
TData
>;
}) => {
const { query: queryOptions } = options ?? {};
const queryKey =
queryOptions?.queryKey ?? getGetMetricsOnboardingStatusQueryKey();
const queryFn: QueryFunction<
Awaited<ReturnType<typeof getMetricsOnboardingStatus>>
> = ({ signal }) => getMetricsOnboardingStatus(signal);
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
Awaited<ReturnType<typeof getMetricsOnboardingStatus>>,
TError,
TData
> & { queryKey: QueryKey };
};
export type GetMetricsOnboardingStatusQueryResult = NonNullable<
Awaited<ReturnType<typeof getMetricsOnboardingStatus>>
>;
export type GetMetricsOnboardingStatusQueryError = ErrorType<RenderErrorResponseDTO>;
/**
* @summary Check if non-SigNoz metrics have been received
*/
export function useGetMetricsOnboardingStatus<
TData = Awaited<ReturnType<typeof getMetricsOnboardingStatus>>,
TError = ErrorType<RenderErrorResponseDTO>
>(options?: {
query?: UseQueryOptions<
Awaited<ReturnType<typeof getMetricsOnboardingStatus>>,
TError,
TData
>;
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
const queryOptions = getGetMetricsOnboardingStatusQueryOptions(options);
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
queryKey: QueryKey;
};
query.queryKey = queryOptions.queryKey;
return query;
}
/**
* @summary Check if non-SigNoz metrics have been received
*/
export const invalidateGetMetricsOnboardingStatus = async (
queryClient: QueryClient,
options?: InvalidateOptions,
): Promise<QueryClient> => {
await queryClient.invalidateQueries(
{ queryKey: getGetMetricsOnboardingStatusQueryKey() },
options,
);
return queryClient;
};
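The hook and the `invalidateGetMetricsOnboardingStatus` helper above stay in sync because both rebuild the query key from the same factory. A minimal sketch of that convention (the factory name mirrors the generated one; the rest is illustrative):

```typescript
// The query key is derived from the endpoint path, so the hook that caches
// the response and the helper that invalidates it always address the same
// cache entry without sharing any state beyond this factory.
const getOnboardingStatusQueryKey = () =>
  ['/api/v2/metrics/onboarding'] as const;
```

Calling the factory at invalidation time, rather than storing the key, means a change to the path only needs to happen in one place.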
/**
* This endpoint provides list of metrics with their number of samples and timeseries for the given time range
* @summary Get metrics statistics

View File

@@ -1363,6 +1363,32 @@ export interface GlobaltypesTokenizerConfigDTO {
enabled?: boolean;
}
export interface MetricsexplorertypesInspectMetricsRequestDTO {
/**
* @type integer
* @format int64
*/
end: number;
filter?: Querybuildertypesv5FilterDTO;
/**
* @type string
*/
metricName: string;
/**
* @type integer
* @format int64
*/
start: number;
}
export interface MetricsexplorertypesInspectMetricsResponseDTO {
/**
* @type array
* @nullable true
*/
series: Querybuildertypesv5TimeSeriesDTO[] | null;
}
export interface MetricsexplorertypesListMetricDTO {
/**
* @type string
@@ -1508,6 +1534,13 @@ export interface MetricsexplorertypesMetricMetadataDTO {
unit: string;
}
export interface MetricsexplorertypesMetricsOnboardingResponseDTO {
/**
* @type boolean
*/
hasMetrics: boolean;
}
export interface MetricsexplorertypesStatDTO {
/**
* @type string
@@ -4391,6 +4424,22 @@ export type GetMetricMetadata200 = {
export type UpdateMetricMetadataPathParameters = {
metricName: string;
};
export type InspectMetrics200 = {
data: MetricsexplorertypesInspectMetricsResponseDTO;
/**
* @type string
*/
status: string;
};
export type GetMetricsOnboardingStatus200 = {
data: MetricsexplorertypesMetricsOnboardingResponseDTO;
/**
* @type string
*/
status: string;
};
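Under the `MetricsexplorertypesInspectMetricsRequestDTO` schema above, a request body might look like the following sketch. The metric name and timestamps are hypothetical, and the epoch-milliseconds unit is an assumption (the schema only says `int64`); the 30-minute cap comes from the endpoint's doc comment.

```typescript
// Hypothetical inspect-request payload; values are illustrative, not real data.
// Assumes start/end are epoch milliseconds and the window is capped at 30 minutes.
const THIRTY_MINUTES_MS = 30 * 60 * 1000;
const end = 1_700_000_000_000; // example timestamp
const inspectRequest = {
  metricName: 'http_server_duration', // hypothetical metric name
  start: end - THIRTY_MINUTES_MS,
  end,
};
```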
export type GetMetricsStats200 = {
data: MetricsexplorertypesStatsResponseDTO;
/**

View File

@@ -1,54 +0,0 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { TagFilter } from 'types/api/queryBuilder/queryBuilderData';
export interface InspectMetricsRequest {
metricName: string;
start: number;
end: number;
filters: TagFilter;
}
export interface InspectMetricsResponse {
status: string;
data: {
series: InspectMetricsSeries[];
};
}
export interface InspectMetricsSeries {
title?: string;
strokeColor?: string;
labels: Record<string, string>;
labelsArray: Array<Record<string, string>>;
values: InspectMetricsTimestampValue[];
}
interface InspectMetricsTimestampValue {
timestamp: number;
value: string;
}
export const getInspectMetricsDetails = async (
request: InspectMetricsRequest,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<SuccessResponse<InspectMetricsResponse> | ErrorResponse> => {
try {
const response = await axios.post(`/metrics/inspect`, request, {
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: 'Success',
payload: response.data,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -1,75 +0,0 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { MetricType } from './getMetricsList';
export interface MetricDetails {
name: string;
description: string;
type: string;
unit: string;
timeseries: number;
samples: number;
timeSeriesTotal: number;
timeSeriesActive: number;
lastReceived: string;
attributes: MetricDetailsAttribute[] | null;
metadata?: {
metric_type: MetricType;
description: string;
unit: string;
temporality?: Temporality;
};
alerts: MetricDetailsAlert[] | null;
dashboards: MetricDetailsDashboard[] | null;
}
export enum Temporality {
CUMULATIVE = 'Cumulative',
DELTA = 'Delta',
}
export interface MetricDetailsAttribute {
key: string;
value: string[];
valueCount: number;
}
export interface MetricDetailsAlert {
alert_name: string;
alert_id: string;
}
export interface MetricDetailsDashboard {
dashboard_name: string;
dashboard_id: string;
}
export interface MetricDetailsResponse {
status: string;
data: MetricDetails;
}
export const getMetricDetails = async (
metricName: string,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<SuccessResponse<MetricDetailsResponse> | ErrorResponse> => {
try {
const response = await axios.get(`/metrics/${metricName}/metadata`, {
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: 'Success',
payload: response.data,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -1,67 +0,0 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import {
OrderByPayload,
TreemapViewType,
} from 'container/MetricsExplorer/Summary/types';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
import { TagFilter } from 'types/api/queryBuilder/queryBuilderData';
export interface MetricsListPayload {
filters: TagFilter;
groupBy?: BaseAutocompleteData[];
offset?: number;
limit?: number;
orderBy?: OrderByPayload;
}
export enum MetricType {
SUM = 'Sum',
GAUGE = 'Gauge',
HISTOGRAM = 'Histogram',
SUMMARY = 'Summary',
EXPONENTIAL_HISTOGRAM = 'ExponentialHistogram',
}
export interface MetricsListItemData {
metric_name: string;
description: string;
type: MetricType;
unit: string;
[TreemapViewType.TIMESERIES]: number;
[TreemapViewType.SAMPLES]: number;
lastReceived: string;
}
export interface MetricsListResponse {
status: string;
data: {
metrics: MetricsListItemData[];
total?: number;
};
}
export const getMetricsList = async (
props: MetricsListPayload,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<SuccessResponse<MetricsListResponse> | ErrorResponse> => {
try {
const response = await axios.post('/metrics', props, {
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data,
params: props,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -1,44 +0,0 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { BaseAutocompleteData } from 'types/api/queryBuilder/queryAutocompleteResponse';
export interface MetricsListFilterKeysResponse {
status: string;
data: {
metricColumns: string[];
attributeKeys: BaseAutocompleteData[];
};
}
export interface GetMetricsListFilterKeysParams {
searchText: string;
limit?: number;
}
export const getMetricsListFilterKeys = async (
params: GetMetricsListFilterKeysParams,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<SuccessResponse<MetricsListFilterKeysResponse> | ErrorResponse> => {
try {
const response = await axios.get('/metrics/filters/keys', {
params: {
searchText: params.searchText,
limit: params.limit,
},
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -1,43 +0,0 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
export interface MetricsListFilterValuesPayload {
filterAttributeKeyDataType: string;
filterKey: string;
searchText: string;
limit: number;
}
export interface MetricsListFilterValuesResponse {
status: string;
data: {
filterValues: string[];
};
}
export const getMetricsListFilterValues = async (
props: MetricsListFilterValuesPayload,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<
SuccessResponse<MetricsListFilterValuesResponse> | ErrorResponse
> => {
try {
const response = await axios.post('/metrics/filters/values', props, {
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data,
params: props,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -1,60 +0,0 @@
import axios from 'api';
import { ErrorResponseHandler } from 'api/ErrorResponseHandler';
import { AxiosError } from 'axios';
import { ErrorResponse, SuccessResponse } from 'types/api';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
export interface RelatedMetricsPayload {
start: number;
end: number;
currentMetricName: string;
}
export interface RelatedMetricDashboard {
dashboard_name: string;
dashboard_id: string;
widget_id: string;
widget_name: string;
}
export interface RelatedMetricAlert {
alert_name: string;
alert_id: string;
}
export interface RelatedMetric {
name: string;
query: IBuilderQuery;
dashboards: RelatedMetricDashboard[];
alerts: RelatedMetricAlert[];
}
export interface RelatedMetricsResponse {
status: 'success';
data: {
related_metrics: RelatedMetric[];
};
}
export const getRelatedMetrics = async (
props: RelatedMetricsPayload,
signal?: AbortSignal,
headers?: Record<string, string>,
): Promise<SuccessResponse<RelatedMetricsResponse> | ErrorResponse> => {
try {
const response = await axios.post('/metrics/related', props, {
signal,
headers,
});
return {
statusCode: 200,
error: null,
message: response.data.status,
payload: response.data,
params: props,
};
} catch (error) {
return ErrorResponseHandler(error as AxiosError);
}
};

View File

@@ -1,6 +1,7 @@
import { useCopyToClipboard } from 'react-use';
import { toast } from '@signozhq/sonner';
import { fireEvent, within } from '@testing-library/react';
import { DEFAULT_TIME_RANGE } from 'container/TopNav/DateTimeSelectionV2/constants';
import { StatusCodes } from 'http-status-codes';
import {
publishedPublicDashboardMeta,
@@ -33,7 +34,6 @@ const mockToast = jest.mocked(toast);
// Test constants
const MOCK_DASHBOARD_ID = 'test-dashboard-id';
const MOCK_PUBLIC_PATH = '/public/dashboard/test-dashboard-id';
const DEFAULT_TIME_RANGE = '30m';
const DASHBOARD_VARIABLES_WARNING =
"Dashboard variables won't work in public dashboards";

View File

@@ -7,6 +7,7 @@ import { Button, Select, Typography } from 'antd';
import createPublicDashboardAPI from 'api/dashboard/public/createPublicDashboard';
import revokePublicDashboardAccessAPI from 'api/dashboard/public/revokePublicDashboardAccess';
import updatePublicDashboardAPI from 'api/dashboard/public/updatePublicDashboard';
import { DEFAULT_TIME_RANGE } from 'container/TopNav/DateTimeSelectionV2/constants';
import { useGetPublicDashboardMeta } from 'hooks/dashboard/useGetPublicDashboardMeta';
import { useGetTenantLicense } from 'hooks/useGetTenantLicense';
import { Copy, ExternalLink, Globe, Info, Loader2, Trash } from 'lucide-react';
@@ -56,7 +57,7 @@ function PublicDashboardSetting(): JSX.Element {
PublicDashboardMetaProps | undefined
>(undefined);
const [timeRangeEnabled, setTimeRangeEnabled] = useState(true);
const [defaultTimeRange, setDefaultTimeRange] = useState('30m');
const [defaultTimeRange, setDefaultTimeRange] = useState(DEFAULT_TIME_RANGE);
const [, setCopyPublicDashboardURL] = useCopyToClipboard();
const { selectedDashboard } = useDashboardStore();
@@ -99,7 +100,7 @@ function PublicDashboardSetting(): JSX.Element {
console.error('Error getting public dashboard', errorPublicDashboard);
setPublicDashboardData(undefined);
setTimeRangeEnabled(true);
setDefaultTimeRange('30m');
setDefaultTimeRange(DEFAULT_TIME_RANGE);
}
}, [publicDashboardResponse, errorPublicDashboard]);
@@ -109,7 +110,7 @@ function PublicDashboardSetting(): JSX.Element {
publicDashboardResponse?.data?.timeRangeEnabled || false,
);
setDefaultTimeRange(
publicDashboardResponse?.data?.defaultTimeRange || '30m',
publicDashboardResponse?.data?.defaultTimeRange || DEFAULT_TIME_RANGE,
);
}
}, [publicDashboardResponse]);
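The diff above replaces the scattered `'30m'` literals with a shared `DEFAULT_TIME_RANGE` constant and keeps the `||` fallback. A self-contained sketch of that pattern (the constant's value mirrors the replaced literal; the helper name is illustrative):

```typescript
// Shared constant replacing the repeated '30m' literal.
const DEFAULT_TIME_RANGE = '30m';

// `||` (rather than `??`) also treats the empty string as missing,
// so an empty stored range falls back to the default as well.
function resolveTimeRange(fromServer?: string): string {
  return fromServer || DEFAULT_TIME_RANGE;
}
```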

View File

@@ -1,10 +1,11 @@
/* eslint-disable sonarjs/no-duplicate-string */
import React, { useCallback, useEffect, useMemo, useState } from 'react';
import React, { useCallback, useEffect, useState } from 'react';
import { useMutation, useQuery } from 'react-query';
import { Color } from '@signozhq/design-tokens';
import { Compass, Dot, House, Plus, Wrench } from '@signozhq/icons';
import { Button, Popover } from 'antd';
import logEvent from 'api/common/logEvent';
import { useGetMetricsOnboardingStatus } from 'api/generated/services/metrics';
import listUserPreferences from 'api/v1/user/preferences/list';
import updateUserPreferenceAPI from 'api/v1/user/preferences/name/update';
import { PersistedAnnouncementBanner } from 'components/AnnouncementBanner';
@@ -15,9 +16,8 @@ import { ORG_PREFERENCES } from 'constants/orgPreferences';
import { initialQueriesMap, PANEL_TYPES } from 'constants/queryBuilder';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import ROUTES from 'constants/routes';
import { getMetricsListQuery } from 'container/MetricsExplorer/Summary/utils';
import { IS_SERVICE_ACCOUNTS_ENABLED } from 'container/ServiceAccountsSettings/config';
import { useGetMetricsList } from 'hooks/metricsExplorer/useGetMetricsList';
import { DEFAULT_TIME_RANGE } from 'container/TopNav/DateTimeSelectionV2/constants';
import { useGetQueryRange } from 'hooks/queryBuilder/useGetQueryRange';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { useSafeNavigate } from 'hooks/useSafeNavigate';
@@ -81,7 +81,7 @@ export default function Home(): JSX.Element {
query: initialQueriesMap[DataSource.LOGS],
graphType: PANEL_TYPES.VALUE,
selectedTime: 'GLOBAL_TIME',
globalSelectedInterval: '30m',
globalSelectedInterval: DEFAULT_TIME_RANGE,
params: {
dataSource: DataSource.LOGS,
},
@@ -91,7 +91,7 @@ export default function Home(): JSX.Element {
{
queryKey: [
REACT_QUERY_KEY.GET_QUERY_RANGE,
'30m',
DEFAULT_TIME_RANGE,
endTime || Date.now(),
startTime || Date.now(),
initialQueriesMap[DataSource.LOGS],
@@ -106,7 +106,7 @@ export default function Home(): JSX.Element {
query: initialQueriesMap[DataSource.TRACES],
graphType: PANEL_TYPES.VALUE,
selectedTime: 'GLOBAL_TIME',
globalSelectedInterval: '30m',
globalSelectedInterval: DEFAULT_TIME_RANGE,
params: {
dataSource: DataSource.TRACES,
},
@@ -116,7 +116,7 @@ export default function Home(): JSX.Element {
{
queryKey: [
REACT_QUERY_KEY.GET_QUERY_RANGE,
'30m',
DEFAULT_TIME_RANGE,
endTime || Date.now(),
startTime || Date.now(),
initialQueriesMap[DataSource.TRACES],
@@ -126,38 +126,7 @@ export default function Home(): JSX.Element {
);
// Detect Metrics
const query = useMemo(() => {
const baseQuery = getMetricsListQuery();
let queryStartTime = startTime;
let queryEndTime = endTime;
if (!startTime || !endTime) {
const now = new Date();
const startTime = new Date(now.getTime() - homeInterval);
const endTime = now;
queryStartTime = startTime.getTime();
queryEndTime = endTime.getTime();
}
return {
...baseQuery,
limit: 10,
offset: 0,
filters: {
items: [],
op: 'AND',
},
start: queryStartTime,
end: queryEndTime,
};
}, [startTime, endTime]);
const { data: metricsData } = useGetMetricsList(query, {
enabled: !!query,
queryKey: ['metricsList', query],
});
const { data: metricsOnboardingData } = useGetMetricsOnboardingStatus();
const [isLogsIngestionActive, setIsLogsIngestionActive] = useState(false);
const [isTracesIngestionActive, setIsTracesIngestionActive] = useState(false);
@@ -283,14 +252,12 @@ export default function Home(): JSX.Element {
}, [tracesData, handleUpdateChecklistDoneItem]);
useEffect(() => {
const metricsDataTotal = metricsData?.payload?.data?.total ?? 0;
if (metricsDataTotal > 0) {
if (metricsOnboardingData?.data?.hasMetrics) {
setIsMetricsIngestionActive(true);
handleUpdateChecklistDoneItem('ADD_DATA_SOURCE');
handleUpdateChecklistDoneItem('SEND_METRICS');
}
}, [metricsData, handleUpdateChecklistDoneItem]);
}, [metricsOnboardingData, handleUpdateChecklistDoneItem]);
useEffect(() => {
logEvent('Homepage: Visited', {});
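The Home.tsx change above swaps counting list results (`total > 0`) for the lightweight onboarding endpoint's boolean flag. A minimal sketch of the simplified check, not the app's actual helper:

```typescript
// Shape of the onboarding response, per MetricsexplorertypesMetricsOnboardingResponseDTO.
interface MetricsOnboardingResponse {
  data?: { hasMetrics?: boolean };
}

// Ingestion is active only when the flag is present and true;
// a missing response or missing flag coerces to false.
function metricsIngestionActive(resp?: MetricsOnboardingResponse): boolean {
  return Boolean(resp?.data?.hasMetrics);
}
```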

View File

@@ -1,37 +0,0 @@
import { UseQueryResult } from 'react-query';
import { RelatedMetric } from 'api/metricsExplorer/getRelatedMetrics';
import { SuccessResponse } from 'types/api';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
export enum ExplorerTabs {
TIME_SERIES = 'time-series',
RELATED_METRICS = 'related-metrics',
}
export interface TimeSeriesProps {
showOneChartPerQuery: boolean;
}
export interface RelatedMetricsProps {
metricNames: string[];
}
export interface RelatedMetricsCardProps {
metric: RelatedMetricWithQueryResult;
}
export interface UseGetRelatedMetricsGraphsProps {
selectedMetricName: string | null;
startMs: number;
endMs: number;
}
export interface UseGetRelatedMetricsGraphsReturn {
relatedMetrics: RelatedMetricWithQueryResult[];
isRelatedMetricsLoading: boolean;
isRelatedMetricsError: boolean;
}
export interface RelatedMetricWithQueryResult extends RelatedMetric {
queryResult: UseQueryResult<SuccessResponse<MetricRangePayloadProps>, unknown>;
}

View File

@@ -30,31 +30,6 @@
}
}
.explore-tabs {
margin: 15px 0;
.tab {
background-color: var(--bg-slate-500);
border-color: var(--bg-ink-200);
width: 180px;
padding: 16px 0;
display: flex;
justify-content: center;
align-items: center;
}
.tab:first-of-type {
border-top-left-radius: 2px;
}
.tab:last-of-type {
border-top-right-radius: 2px;
}
.selected-view {
background: var(--bg-ink-500);
}
}
.explore-content {
padding: 0 8px;
@@ -116,81 +91,6 @@
width: 100%;
height: fit-content;
}
.related-metrics-container {
width: 100%;
min-height: 300px;
display: flex;
flex-direction: column;
gap: 10px;
.related-metrics-header {
display: flex;
align-items: center;
justify-content: flex-start;
.metric-name-select {
width: 20%;
margin-right: 10px;
}
.related-metrics-input {
width: 40%;
.ant-input-wrapper {
.ant-input-group-addon {
.related-metrics-select {
width: 250px;
border: 1px solid var(--bg-slate-500) !important;
.ant-select-selector {
text-align: left;
color: var(--text-vanilla-500) !important;
}
}
}
}
}
}
.related-metrics-body {
margin-top: 20px;
max-height: 650px;
overflow-y: scroll;
.related-metrics-card-container {
margin-bottom: 20px;
min-height: 640px;
.related-metrics-card {
display: flex;
flex-direction: column;
gap: 16px;
.related-metrics-card-error {
padding-top: 10px;
height: fit-content;
width: fit-content;
}
}
}
}
}
}
}
.lightMode {
.metrics-explorer-explore-container {
.explore-tabs {
.tab {
background-color: var(--bg-vanilla-100);
border-color: var(--bg-vanilla-400);
}
.selected-view {
background: var(--bg-vanilla-500);
}
}
}
}

View File

@@ -32,7 +32,6 @@ import { v4 as uuid } from 'uuid';
import { MetricsExplorerEventKeys, MetricsExplorerEvents } from '../events';
import MetricDetails from '../MetricDetails/MetricDetails';
import TimeSeries from './TimeSeries';
import { ExplorerTabs } from './types';
import {
getMetricUnits,
splitQueryIntoOneChartPerQuery,
@@ -95,7 +94,6 @@ function Explorer(): JSX.Element {
const [disableOneChartPerQuery, toggleDisableOneChartPerQuery] = useState(
false,
);
const [selectedTab] = useState<ExplorerTabs>(ExplorerTabs.TIME_SERIES);
const [yAxisUnit, setYAxisUnit] = useState<string | undefined>();
const unitsLength = useMemo(() => units.length, [units]);
@@ -319,48 +317,21 @@ function Explorer(): JSX.Element {
showFunctions={false}
version="v3"
/>
{/* TODO: Enable once we have resolved all related metrics issues */}
{/* <Button.Group className="explore-tabs">
<Button
value={ExplorerTabs.TIME_SERIES}
className={classNames('tab', {
'selected-view': selectedTab === ExplorerTabs.TIME_SERIES,
})}
onClick={(): void => setSelectedTab(ExplorerTabs.TIME_SERIES)}
>
<Typography.Text>Time series</Typography.Text>
</Button>
<Button
value={ExplorerTabs.RELATED_METRICS}
className={classNames('tab', {
'selected-view': selectedTab === ExplorerTabs.RELATED_METRICS,
})}
onClick={(): void => setSelectedTab(ExplorerTabs.RELATED_METRICS)}
>
<Typography.Text>Related</Typography.Text>
</Button>
</Button.Group> */}
<div className="explore-content">
{selectedTab === ExplorerTabs.TIME_SERIES && (
<TimeSeries
showOneChartPerQuery={showOneChartPerQuery}
setWarning={setWarning}
areAllMetricUnitsSame={areAllMetricUnitsSame}
isMetricUnitsLoading={isMetricUnitsLoading}
isMetricUnitsError={isMetricUnitsError}
metricUnits={units}
metricNames={metricNames}
metrics={metrics}
handleOpenMetricDetails={handleOpenMetricDetails}
yAxisUnit={yAxisUnit}
setYAxisUnit={setYAxisUnit}
showYAxisUnitSelector={showYAxisUnitSelector}
/>
)}
{/* TODO: Enable once we have resolved all related metrics issues */}
{/* {selectedTab === ExplorerTabs.RELATED_METRICS && (
<RelatedMetrics metricNames={metricNames} />
)} */}
<TimeSeries
showOneChartPerQuery={showOneChartPerQuery}
setWarning={setWarning}
areAllMetricUnitsSame={areAllMetricUnitsSame}
isMetricUnitsLoading={isMetricUnitsLoading}
isMetricUnitsError={isMetricUnitsError}
metricUnits={units}
metricNames={metricNames}
metrics={metrics}
handleOpenMetricDetails={handleOpenMetricDetails}
yAxisUnit={yAxisUnit}
setYAxisUnit={setYAxisUnit}
showYAxisUnitSelector={showYAxisUnitSelector}
/>
</div>
</div>
<ExplorerOptionWrapper

View File

@@ -1,153 +0,0 @@
import { useEffect, useMemo, useRef, useState } from 'react';
// eslint-disable-next-line no-restricted-imports
import { useSelector } from 'react-redux';
import { Color } from '@signozhq/design-tokens';
import { Card, Col, Empty, Input, Row, Select, Skeleton } from 'antd';
import { Gauge } from 'lucide-react';
import { AppState } from 'store/reducers';
import { GlobalReducer } from 'types/reducer/globalTime';
import RelatedMetricsCard from './RelatedMetricsCard';
import { RelatedMetricsProps, RelatedMetricWithQueryResult } from './types';
import { useGetRelatedMetricsGraphs } from './useGetRelatedMetricsGraphs';
function RelatedMetrics({ metricNames }: RelatedMetricsProps): JSX.Element {
const graphRef = useRef<HTMLDivElement>(null);
const { maxTime, minTime } = useSelector<AppState, GlobalReducer>(
(state) => state.globalTime,
);
const [selectedMetricName, setSelectedMetricName] = useState<string | null>(
null,
);
const [selectedRelatedMetric, setSelectedRelatedMetric] = useState('all');
const [searchValue, setSearchValue] = useState<string | null>(null);
const startMs = useMemo(() => Math.floor(Number(minTime) / 1000000000), [
minTime,
]);
const endMs = useMemo(() => Math.floor(Number(maxTime) / 1000000000), [
maxTime,
]);
useEffect(() => {
if (metricNames.length) {
setSelectedMetricName(metricNames[0]);
}
}, [metricNames]);
const {
relatedMetrics,
isRelatedMetricsLoading,
isRelatedMetricsError,
} = useGetRelatedMetricsGraphs({
selectedMetricName,
startMs,
endMs,
});
const metricNamesSelectOptions = useMemo(
() =>
metricNames.map((name) => ({
value: name,
label: name,
})),
[metricNames],
);
const relatedMetricsSelectOptions = useMemo(() => {
const options: { value: string; label: string }[] = [
{
value: 'all',
label: 'All',
},
];
relatedMetrics.forEach((metric) => {
options.push({
value: metric.name,
label: metric.name,
});
});
return options;
}, [relatedMetrics]);
const filteredRelatedMetrics = useMemo(() => {
let filteredMetrics: RelatedMetricWithQueryResult[] = [];
if (selectedRelatedMetric === 'all') {
filteredMetrics = [...relatedMetrics];
} else {
filteredMetrics = relatedMetrics.filter(
(metric) => metric.name === selectedRelatedMetric,
);
}
if (searchValue?.length) {
filteredMetrics = filteredMetrics.filter((metric) =>
metric.name.toLowerCase().includes(searchValue?.toLowerCase() ?? ''),
);
}
return filteredMetrics;
}, [relatedMetrics, selectedRelatedMetric, searchValue]);
return (
<div className="related-metrics-container">
<div className="related-metrics-header">
<Select
className="metric-name-select"
value={selectedMetricName}
options={metricNamesSelectOptions}
onChange={(value): void => setSelectedMetricName(value)}
suffixIcon={<Gauge size={12} color={Color.BG_SAKURA_500} />}
/>
<Input
className="related-metrics-input"
placeholder="Search..."
onChange={(e): void => setSearchValue(e.target.value)}
bordered
addonBefore={
<Select
loading={isRelatedMetricsLoading}
value={selectedRelatedMetric}
className="related-metrics-select"
options={relatedMetricsSelectOptions}
onChange={(value): void => setSelectedRelatedMetric(value)}
bordered={false}
/>
}
/>
</div>
<div className="related-metrics-body">
{isRelatedMetricsLoading && <Skeleton active />}
{isRelatedMetricsError && (
<Empty description="Error fetching related metrics" />
)}
{!isRelatedMetricsLoading &&
!isRelatedMetricsError &&
filteredRelatedMetrics.length === 0 && (
<Empty description="No related metrics found" />
)}
{!isRelatedMetricsLoading &&
!isRelatedMetricsError &&
filteredRelatedMetrics.length > 0 && (
<Row gutter={24}>
{filteredRelatedMetrics.map((relatedMetricWithQueryResult) => (
<Col span={12} key={relatedMetricWithQueryResult.name}>
<Card
bordered
ref={graphRef}
className="related-metrics-card-container"
>
<RelatedMetricsCard
key={relatedMetricWithQueryResult.name}
metric={relatedMetricWithQueryResult}
/>
</Card>
</Col>
))}
</Row>
)}
</div>
</div>
);
}
export default RelatedMetrics;

View File

@@ -1,47 +0,0 @@
import { Empty, Skeleton, Typography } from 'antd';
import TimeSeriesView from 'container/TimeSeriesView/TimeSeriesView';
import { DataSource } from 'types/common/queryBuilder';
import DashboardsAndAlertsPopover from '../MetricDetails/DashboardsAndAlertsPopover';
import { RelatedMetricsCardProps } from './types';
function RelatedMetricsCard({ metric }: RelatedMetricsCardProps): JSX.Element {
const { queryResult } = metric;
if (queryResult.isLoading) {
return <Skeleton />;
}
if (queryResult.error) {
const errorMessage =
(queryResult.error as Error)?.message || 'Something went wrong';
return <div>{errorMessage}</div>;
}
return (
<div className="related-metrics-card">
<Typography.Text className="related-metrics-card-name">
{metric.name}
</Typography.Text>
{queryResult.isLoading ? <Skeleton /> : null}
{queryResult.isError ? (
<div className="related-metrics-card-error">
<Empty description="Error fetching metric data" />
</div>
) : null}
{!queryResult.isLoading && !queryResult.error && (
<TimeSeriesView
isFilterApplied={false}
isError={queryResult.isError}
isLoading={queryResult.isLoading}
data={queryResult.data}
yAxisUnit="ms"
dataSource={DataSource.METRICS}
/>
)}
<DashboardsAndAlertsPopover metricName={metric.name} />
</div>
);
}
export default RelatedMetricsCard;

View File

@@ -1,14 +1,6 @@
import { Dispatch, SetStateAction } from 'react';
import { UseQueryResult } from 'react-query';
import { MetricsexplorertypesMetricMetadataDTO } from 'api/generated/services/sigNoz.schemas';
import { RelatedMetric } from 'api/metricsExplorer/getRelatedMetrics';
import { SuccessResponse, Warning } from 'types/api';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
export enum ExplorerTabs {
TIME_SERIES = 'time-series',
RELATED_METRICS = 'related-metrics',
}
import { Warning } from 'types/api';
export interface TimeSeriesProps {
showOneChartPerQuery: boolean;
@@ -24,27 +16,3 @@ export interface TimeSeriesProps {
setYAxisUnit: (unit: string) => void;
showYAxisUnitSelector: boolean;
}
export interface RelatedMetricsProps {
metricNames: string[];
}
export interface RelatedMetricsCardProps {
metric: RelatedMetricWithQueryResult;
}
export interface UseGetRelatedMetricsGraphsProps {
selectedMetricName: string | null;
startMs: number;
endMs: number;
}
export interface UseGetRelatedMetricsGraphsReturn {
relatedMetrics: RelatedMetricWithQueryResult[];
isRelatedMetricsLoading: boolean;
isRelatedMetricsError: boolean;
}
export interface RelatedMetricWithQueryResult extends RelatedMetric {
queryResult: UseQueryResult<SuccessResponse<MetricRangePayloadProps>, unknown>;
}


@@ -1,113 +0,0 @@
import { useMemo } from 'react';
import { useQueries } from 'react-query';
// eslint-disable-next-line no-restricted-imports
import { useSelector } from 'react-redux';
import { ENTITY_VERSION_V4 } from 'constants/app';
import { PANEL_TYPES } from 'constants/queryBuilder';
import { useGetRelatedMetrics } from 'hooks/metricsExplorer/useGetRelatedMetrics';
import { GetMetricQueryRange } from 'lib/dashboard/getQueryResults';
import { AppState } from 'store/reducers';
import { SuccessResponse } from 'types/api';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
import { EQueryType } from 'types/common/dashboard';
import { DataSource } from 'types/common/queryBuilder';
import { GlobalReducer } from 'types/reducer/globalTime';
import { v4 as uuidv4 } from 'uuid';
import { convertNanoToMilliseconds } from '../Summary/utils';
import {
UseGetRelatedMetricsGraphsProps,
UseGetRelatedMetricsGraphsReturn,
} from './types';
export const useGetRelatedMetricsGraphs = ({
selectedMetricName,
startMs,
endMs,
}: UseGetRelatedMetricsGraphsProps): UseGetRelatedMetricsGraphsReturn => {
const { maxTime, minTime, selectedTime: globalSelectedTime } = useSelector<
AppState,
GlobalReducer
>((state) => state.globalTime);
// Build the query for the related metrics
const relatedMetricsQuery = useMemo(
() => ({
start: convertNanoToMilliseconds(minTime),
end: convertNanoToMilliseconds(maxTime),
currentMetricName: selectedMetricName ?? '',
}),
[selectedMetricName, minTime, maxTime],
);
// Get the related metrics
const {
data: relatedMetricsData,
isLoading: isRelatedMetricsLoading,
isError: isRelatedMetricsError,
} = useGetRelatedMetrics(relatedMetricsQuery, {
enabled: !!selectedMetricName,
});
// Build the related metrics array
const relatedMetrics = useMemo(() => {
if (relatedMetricsData?.payload?.data?.related_metrics) {
return relatedMetricsData.payload.data.related_metrics;
}
return [];
}, [relatedMetricsData]);
// Build the query results for the related metrics
const relatedMetricsQueryResults = useQueries(
useMemo(
() =>
relatedMetrics.map((metric) => ({
queryKey: ['related-metrics', metric.name],
queryFn: (): Promise<SuccessResponse<MetricRangePayloadProps>> =>
GetMetricQueryRange(
{
query: {
queryType: EQueryType.QUERY_BUILDER,
promql: [],
builder: {
queryData: [metric.query],
queryFormulas: [],
queryTraceOperator: [],
},
clickhouse_sql: [],
id: uuidv4(),
},
graphType: PANEL_TYPES.TIME_SERIES,
selectedTime: 'GLOBAL_TIME',
globalSelectedInterval: globalSelectedTime,
start: startMs,
end: endMs,
formatForWeb: false,
params: {
dataSource: DataSource.METRICS,
},
},
ENTITY_VERSION_V4,
),
enabled: !!metric.query,
})),
[relatedMetrics, globalSelectedTime, startMs, endMs],
),
);
// Build the related metrics with query results
const relatedMetricsWithQueryResults = useMemo(
() =>
relatedMetrics.map((metric, index) => ({
...metric,
queryResult: relatedMetricsQueryResults[index],
})),
[relatedMetrics, relatedMetricsQueryResults],
);
return {
relatedMetrics: relatedMetricsWithQueryResults,
isRelatedMetricsLoading,
isRelatedMetricsError,
};
};


@@ -4,7 +4,6 @@ import { Color } from '@signozhq/design-tokens';
import type { TableColumnsType as ColumnsType } from 'antd';
import { Card, Tooltip, Typography } from 'antd';
import logEvent from 'api/common/logEvent';
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import classNames from 'classnames';
import ResizeTable from 'components/ResizeTable/ResizeTable';
import { DataType } from 'container/LogDetailedView/TableView';
@@ -15,6 +14,7 @@ import {
SPACE_AGGREGATION_OPTIONS_FOR_EXPANDED_VIEW,
TIME_AGGREGATION_OPTIONS,
} from './constants';
import { InspectMetricsSeries } from './types';
import {
ExpandedViewProps,
InspectionStep,
@@ -42,7 +42,8 @@ function ExpandedView({
useEffect(() => {
logEvent(MetricsExplorerEvents.InspectPointClicked, {
[MetricsExplorerEventKeys.Modal]: 'inspect',
[MetricsExplorerEventKeys.Filters]: metricInspectionAppliedOptions.filters,
[MetricsExplorerEventKeys.Filters]:
metricInspectionAppliedOptions.filterExpression,
[MetricsExplorerEventKeys.TimeAggregationInterval]:
metricInspectionAppliedOptions.timeAggregationInterval,
[MetricsExplorerEventKeys.TimeAggregationOption]:


@@ -3,7 +3,7 @@ import * as Sentry from '@sentry/react';
import { Color } from '@signozhq/design-tokens';
import { Button, Drawer, Empty, Skeleton, Typography } from 'antd';
import logEvent from 'api/common/logEvent';
import { useGetMetricDetails } from 'hooks/metricsExplorer/useGetMetricDetails';
import { useGetMetricMetadata } from 'api/generated/services/metrics';
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
import { useQueryOperations } from 'hooks/queryBuilder/useQueryBuilderOperations';
import { useIsDarkMode } from 'hooks/useDarkMode';
@@ -48,10 +48,12 @@ function Inspect({
] = useState<GraphPopoverOptions | null>(null);
const [showExpandedView, setShowExpandedView] = useState(false);
const { data: metricDetailsData } = useGetMetricDetails(
appliedMetricName ?? '',
const { data: metricDetailsData } = useGetMetricMetadata(
{ metricName: appliedMetricName ?? '' },
{
enabled: !!appliedMetricName,
query: {
enabled: !!appliedMetricName,
},
},
);
@@ -93,7 +95,6 @@ function Inspect({
const {
inspectMetricsTimeSeries,
inspectMetricsStatusCode,
isInspectMetricsLoading,
isInspectMetricsError,
formattedInspectMetricsTimeSeries,
@@ -118,15 +119,13 @@ function Inspect({
[dispatchMetricInspectionOptions],
);
const selectedMetricType = useMemo(
() => metricDetailsData?.payload?.data?.metadata?.metric_type,
[metricDetailsData],
);
const selectedMetricType = useMemo(() => metricDetailsData?.data?.type, [
metricDetailsData,
]);
const selectedMetricUnit = useMemo(
() => metricDetailsData?.payload?.data?.metadata?.unit,
[metricDetailsData],
);
const selectedMetricUnit = useMemo(() => metricDetailsData?.data?.unit, [
metricDetailsData,
]);
const aggregateAttribute = useMemo(
() => ({
@@ -180,11 +179,8 @@ function Inspect({
);
}
if (isInspectMetricsError || inspectMetricsStatusCode !== 200) {
const errorMessage =
inspectMetricsStatusCode === 400
? 'The time range is too large. Please modify it to be within 30 minutes.'
: 'Error loading inspect metrics.';
if (isInspectMetricsError) {
const errorMessage = 'Error loading inspect metrics.';
return (
<div
@@ -261,7 +257,6 @@ function Inspect({
isInspectMetricsLoading,
isInspectMetricsRefetching,
isInspectMetricsError,
inspectMetricsStatusCode,
inspectMetricsTimeSeries,
aggregatedTimeSeries,
formattedInspectMetricsTimeSeries,


@@ -33,7 +33,7 @@ function MetricFilters({
});
dispatchMetricInspectionOptions({
type: 'SET_FILTERS',
payload: tagFilter,
payload: expression,
});
},
[currentQuery, dispatchMetricInspectionOptions, setCurrentQuery],


@@ -1,8 +1,8 @@
import { useCallback, useMemo } from 'react';
import type { TableColumnsType as ColumnsType } from 'antd';
import { Card, Flex, Table, Typography } from 'antd';
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import { InspectMetricsSeries } from './types';
import { TableViewProps } from './types';
import { formatTimestampToFullDateTime } from './utils';


@@ -1,11 +1,11 @@
import { render, screen } from '@testing-library/react';
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import {
SPACE_AGGREGATION_OPTIONS_FOR_EXPANDED_VIEW,
TIME_AGGREGATION_OPTIONS,
} from '../constants';
import ExpandedView from '../ExpandedView';
import { InspectMetricsSeries } from '../types';
import {
GraphPopoverData,
InspectionStep,
@@ -25,7 +25,6 @@ describe('ExpandedView', () => {
labels: {
host_id: 'test-id',
},
labelsArray: [],
title: 'TS1',
};
@@ -66,10 +65,7 @@ describe('ExpandedView', () => {
timeAggregationInterval: 60,
spaceAggregationOption: SpaceAggregationOptions.MAX_BY,
spaceAggregationLabels: ['host_name'],
filters: {
items: [],
op: 'AND',
},
filterExpression: '',
};
it('renders entire time series for a raw data inspection', () => {


@@ -1,7 +1,7 @@
import { fireEvent, render, screen } from '@testing-library/react';
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import GraphPopover from '../GraphPopover';
import { InspectMetricsSeries } from '../types';
import { GraphPopoverOptions, InspectionStep } from '../types';
describe('GraphPopover', () => {
@@ -16,7 +16,6 @@ describe('GraphPopover', () => {
{ timestamp: 1672531260000, value: '43.456' },
],
labels: {},
labelsArray: [],
},
};
const mockSpaceAggregationSeriesMap: Map<


@@ -2,12 +2,12 @@
import { Provider } from 'react-redux';
import { render, screen } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import { MetricType } from 'api/metricsExplorer/getMetricsList';
import { MetrictypesTypeDTO } from 'api/generated/services/sigNoz.schemas';
import store from 'store';
import { AlignedData } from 'uplot';
import GraphView from '../GraphView';
import { InspectMetricsSeries } from '../types';
import {
InspectionStep,
SpaceAggregationOptions,
@@ -32,7 +32,6 @@ describe('GraphView', () => {
{ timestamp: 1234567891000, value: '20' },
],
labels: { label1: 'value1' },
labelsArray: [{ label: 'label1', value: 'value1' }],
},
];
@@ -44,7 +43,7 @@ describe('GraphView', () => {
] as AlignedData,
metricUnit: '',
metricName: 'test_metric',
metricType: MetricType.GAUGE,
metricType: MetrictypesTypeDTO.gauge,
spaceAggregationSeriesMap: new Map(),
inspectionStep: InspectionStep.COMPLETED,
setPopoverOptions: jest.fn(),
@@ -58,10 +57,7 @@ describe('GraphView', () => {
spaceAggregationOption: SpaceAggregationOptions.MAX_BY,
spaceAggregationLabels: ['host_name'],
timeAggregationOption: TimeAggregationOptions.MAX,
filters: {
items: [],
op: 'AND',
},
filterExpression: '',
},
isInspectMetricsRefetching: false,
};


@@ -2,17 +2,21 @@ import { QueryClient, QueryClientProvider } from 'react-query';
// eslint-disable-next-line no-restricted-imports
import { Provider } from 'react-redux';
import { render, screen } from '@testing-library/react';
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import { MetricType } from 'api/metricsExplorer/getMetricsList';
import * as useInspectMetricsHooks from 'hooks/metricsExplorer/useGetInspectMetricsDetails';
import * as useGetMetricDetailsHooks from 'hooks/metricsExplorer/useGetMetricDetails';
import * as metricsGeneratedAPI from 'api/generated/services/metrics';
import { MetrictypesTypeDTO } from 'api/generated/services/sigNoz.schemas';
import * as appContextHooks from 'providers/App/App';
import store from 'store';
import ROUTES from '../../../../constants/routes';
import { LicenseEvent } from '../../../../types/api/licensesV3/getActive';
import { INITIAL_INSPECT_METRICS_OPTIONS } from '../constants';
import Inspect from '../Inspect';
import { InspectionStep } from '../types';
import {
InspectionStep,
InspectMetricsSeries,
UseInspectMetricsReturnData,
} from '../types';
import * as useInspectMetricsModule from '../useInspectMetrics';
const queryClient = new QueryClient();
const mockTimeSeries: InspectMetricsSeries[] = [
@@ -24,7 +28,6 @@ const mockTimeSeries: InspectMetricsSeries[] = [
{ timestamp: 1234567891000, value: '20' },
],
labels: { label1: 'value1' },
labelsArray: [{ label: 'label1', value: 'value1' }],
},
];
@@ -52,29 +55,19 @@ jest.spyOn(appContextHooks, 'useAppContext').mockReturnValue({
},
} as any);
jest.spyOn(useGetMetricDetailsHooks, 'useGetMetricDetails').mockReturnValue({
jest.spyOn(metricsGeneratedAPI, 'useGetMetricMetadata').mockReturnValue({
data: {
metricDetails: {
metricName: 'test_metric',
metricType: MetricType.GAUGE,
data: {
type: MetrictypesTypeDTO.gauge,
unit: '',
description: '',
temporality: '',
isMonotonic: false,
},
status: 'success',
},
} as any);
jest
.spyOn(useInspectMetricsHooks, 'useGetInspectMetricsDetails')
.mockReturnValue({
data: {
payload: {
data: {
series: mockTimeSeries,
},
status: 'success',
},
},
isLoading: false,
} as any);
jest.mock('react-router-dom', () => ({
...jest.requireActual('react-router-dom'),
useLocation: (): { pathname: string } => ({
@@ -90,16 +83,25 @@ mockResizeObserver.mockImplementation(() => ({
}));
window.ResizeObserver = mockResizeObserver;
const baseHookReturn: UseInspectMetricsReturnData = {
inspectMetricsTimeSeries: [],
isInspectMetricsLoading: false,
isInspectMetricsError: false,
formattedInspectMetricsTimeSeries: [[], []],
spaceAggregationLabels: [],
metricInspectionOptions: INITIAL_INSPECT_METRICS_OPTIONS,
dispatchMetricInspectionOptions: jest.fn(),
inspectionStep: InspectionStep.COMPLETED,
isInspectMetricsRefetching: false,
spaceAggregatedSeriesMap: new Map(),
aggregatedTimeSeries: [],
timeAggregatedSeriesMap: new Map(),
reset: jest.fn(),
};
describe('Inspect', () => {
const defaultProps = {
inspectMetricsTimeSeries: mockTimeSeries,
formattedInspectMetricsTimeSeries: [],
metricUnit: '',
metricName: 'test_metric',
metricType: MetricType.GAUGE,
spaceAggregationSeriesMap: new Map(),
inspectionStep: InspectionStep.COMPLETED,
resetInspection: jest.fn(),
isOpen: true,
onClose: jest.fn(),
};
@@ -109,6 +111,12 @@ describe('Inspect', () => {
});
it('renders all components', () => {
jest.spyOn(useInspectMetricsModule, 'useInspectMetrics').mockReturnValue({
...baseHookReturn,
inspectMetricsTimeSeries: mockTimeSeries,
aggregatedTimeSeries: mockTimeSeries,
});
render(
<QueryClientProvider client={queryClient}>
<Provider store={store}>
@@ -123,18 +131,11 @@ describe('Inspect', () => {
});
it('renders loading state', () => {
jest
.spyOn(useInspectMetricsHooks, 'useGetInspectMetricsDetails')
.mockReturnValue({
data: {
payload: {
data: {
series: [],
},
},
},
isLoading: true,
} as any);
jest.spyOn(useInspectMetricsModule, 'useInspectMetrics').mockReturnValue({
...baseHookReturn,
isInspectMetricsLoading: true,
});
render(
<QueryClientProvider client={queryClient}>
<Provider store={store}>
@@ -147,18 +148,11 @@ describe('Inspect', () => {
});
it('renders empty state', () => {
jest
.spyOn(useInspectMetricsHooks, 'useGetInspectMetricsDetails')
.mockReturnValue({
data: {
payload: {
data: {
series: [],
},
},
},
isLoading: false,
} as any);
jest.spyOn(useInspectMetricsModule, 'useInspectMetrics').mockReturnValue({
...baseHookReturn,
inspectMetricsTimeSeries: [],
});
render(
<QueryClientProvider client={queryClient}>
<Provider store={store}>
@@ -171,39 +165,11 @@ describe('Inspect', () => {
});
it('renders error state', () => {
jest
.spyOn(useInspectMetricsHooks, 'useGetInspectMetricsDetails')
.mockReturnValue({
data: {
payload: {
data: {
series: [],
},
},
},
isLoading: false,
isError: true,
} as any);
render(
<QueryClientProvider client={queryClient}>
<Provider store={store}>
<Inspect {...defaultProps} />
</Provider>
</QueryClientProvider>,
);
jest.spyOn(useInspectMetricsModule, 'useInspectMetrics').mockReturnValue({
...baseHookReturn,
isInspectMetricsError: true,
});
expect(screen.getByTestId('inspect-metrics-error')).toBeInTheDocument();
});
it('renders error state with 400 status code', () => {
jest
.spyOn(useInspectMetricsHooks, 'useGetInspectMetricsDetails')
.mockReturnValue({
data: {
statusCode: 400,
},
isError: false,
} as any);
render(
<QueryClientProvider client={queryClient}>
<Provider store={store}>


@@ -4,7 +4,7 @@ import { Provider } from 'react-redux';
import { fireEvent, render, screen, within } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import * as metricsService from 'api/generated/services/metrics';
import { MetricType } from 'api/metricsExplorer/getMetricsList';
import { MetrictypesTypeDTO } from 'api/generated/services/sigNoz.schemas';
import * as appContextHooks from 'providers/App/App';
import store from 'store';
@@ -89,13 +89,10 @@ describe('QueryBuilder', () => {
timeAggregationOption: TimeAggregationOptions.AVG,
spaceAggregationLabels: [],
spaceAggregationOption: SpaceAggregationOptions.AVG_BY,
filters: {
items: [],
op: 'and',
},
filterExpression: '',
},
dispatchMetricInspectionOptions: jest.fn(),
metricType: MetricType.SUM,
metricType: MetrictypesTypeDTO.sum,
inspectionStep: InspectionStep.TIME_AGGREGATION,
inspectMetricsTimeSeries: [],
currentQuery: {
@@ -103,6 +100,7 @@ describe('QueryBuilder', () => {
items: [],
op: 'and',
},
filterExpression: '',
} as any,
setCurrentQuery: jest.fn(),
};


@@ -1,7 +1,7 @@
import { render, screen } from '@testing-library/react';
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import TableView from '../TableView';
import { InspectMetricsSeries } from '../types';
import {
InspectionStep,
SpaceAggregationOptions,
@@ -19,12 +19,6 @@ describe('TableView', () => {
{ timestamp: 1234567891000, value: '20' },
],
labels: { label1: 'value1' },
labelsArray: [
{
label: 'label1',
value: 'value1',
},
],
},
{
strokeColor: '#fff',
@@ -34,12 +28,6 @@ describe('TableView', () => {
{ timestamp: 1234567891000, value: '40' },
],
labels: { label2: 'value2' },
labelsArray: [
{
label: 'label2',
value: 'value2',
},
],
},
];
@@ -53,10 +41,7 @@ describe('TableView', () => {
timeAggregationOption: TimeAggregationOptions.MAX,
spaceAggregationOption: SpaceAggregationOptions.MAX_BY,
spaceAggregationLabels: ['host_name'],
filters: {
items: [],
op: 'AND',
},
filterExpression: '',
},
isInspectMetricsRefetching: false,
};


@@ -1,6 +1,6 @@
import { ForwardRefExoticComponent, RefAttributes } from 'react';
import { Color } from '@signozhq/design-tokens';
import { MetricType } from 'api/metricsExplorer/getMetricsList';
import { MetrictypesTypeDTO } from 'api/generated/services/sigNoz.schemas';
import {
BarChart,
BarChart2,
@@ -18,25 +18,25 @@ import {
export const INSPECT_FEATURE_FLAG_KEY = 'metrics-explorer-inspect-feature-flag';
export const METRIC_TYPE_TO_COLOR_MAP: Record<MetricType, string> = {
[MetricType.GAUGE]: Color.BG_SAKURA_500,
[MetricType.HISTOGRAM]: Color.BG_SIENNA_500,
[MetricType.SUM]: Color.BG_ROBIN_500,
[MetricType.SUMMARY]: Color.BG_FOREST_500,
[MetricType.EXPONENTIAL_HISTOGRAM]: Color.BG_AQUA_500,
export const METRIC_TYPE_TO_COLOR_MAP: Record<MetrictypesTypeDTO, string> = {
[MetrictypesTypeDTO.gauge]: Color.BG_SAKURA_500,
[MetrictypesTypeDTO.histogram]: Color.BG_SIENNA_500,
[MetrictypesTypeDTO.sum]: Color.BG_ROBIN_500,
[MetrictypesTypeDTO.summary]: Color.BG_FOREST_500,
[MetrictypesTypeDTO.exponentialhistogram]: Color.BG_AQUA_500,
};
export const METRIC_TYPE_TO_ICON_MAP: Record<
MetricType,
MetrictypesTypeDTO,
ForwardRefExoticComponent<
Omit<LucideProps, 'ref'> & RefAttributes<SVGSVGElement>
>
> = {
[MetricType.GAUGE]: Gauge,
[MetricType.HISTOGRAM]: BarChart2,
[MetricType.SUM]: Diff,
[MetricType.SUMMARY]: BarChartHorizontal,
[MetricType.EXPONENTIAL_HISTOGRAM]: BarChart,
[MetrictypesTypeDTO.gauge]: Gauge,
[MetrictypesTypeDTO.histogram]: BarChart2,
[MetrictypesTypeDTO.sum]: Diff,
[MetrictypesTypeDTO.summary]: BarChartHorizontal,
[MetrictypesTypeDTO.exponentialhistogram]: BarChart,
};
export const TIME_AGGREGATION_OPTIONS: Record<
@@ -77,20 +77,14 @@ export const INITIAL_INSPECT_METRICS_OPTIONS: MetricInspectionState = {
timeAggregationInterval: undefined,
spaceAggregationOption: undefined,
spaceAggregationLabels: [],
filters: {
items: [],
op: 'AND',
},
filterExpression: '',
},
appliedOptions: {
timeAggregationOption: undefined,
timeAggregationInterval: undefined,
spaceAggregationOption: undefined,
spaceAggregationLabels: [],
filters: {
items: [],
op: 'AND',
},
filterExpression: '',
},
};


@@ -1,11 +1,19 @@
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import { MetricType } from 'api/metricsExplorer/getMetricsList';
import {
IBuilderQuery,
TagFilter,
} from 'types/api/queryBuilder/queryBuilderData';
import { MetrictypesTypeDTO } from 'api/generated/services/sigNoz.schemas';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
import { AlignedData } from 'uplot';
export interface InspectMetricsTimestampValue {
timestamp: number;
value: string;
}
export interface InspectMetricsSeries {
title?: string;
strokeColor?: string;
labels: Record<string, string>;
values: InspectMetricsTimestampValue[];
}
export type InspectProps = {
metricName: string;
isOpen: boolean;
@@ -14,7 +22,6 @@ export type InspectProps = {
export interface UseInspectMetricsReturnData {
inspectMetricsTimeSeries: InspectMetricsSeries[];
inspectMetricsStatusCode: number;
isInspectMetricsLoading: boolean;
isInspectMetricsError: boolean;
formattedInspectMetricsTimeSeries: AlignedData;
@@ -33,7 +40,7 @@ export interface GraphViewProps {
inspectMetricsTimeSeries: InspectMetricsSeries[];
metricUnit: string | undefined;
metricName: string | null;
metricType?: MetricType | undefined;
metricType?: MetrictypesTypeDTO | undefined;
formattedInspectMetricsTimeSeries: AlignedData;
resetInspection: () => void;
spaceAggregationSeriesMap: Map<string, InspectMetricsSeries[]>;
@@ -106,7 +113,7 @@ export interface MetricInspectionOptions {
timeAggregationInterval: number | undefined;
spaceAggregationOption: SpaceAggregationOptions | undefined;
spaceAggregationLabels: string[];
filters: TagFilter;
filterExpression: string;
}
export interface MetricInspectionState {
@@ -119,7 +126,7 @@ export type MetricInspectionAction =
| { type: 'SET_TIME_AGGREGATION_INTERVAL'; payload: number }
| { type: 'SET_SPACE_AGGREGATION_OPTION'; payload: SpaceAggregationOptions }
| { type: 'SET_SPACE_AGGREGATION_LABELS'; payload: string[] }
| { type: 'SET_FILTERS'; payload: TagFilter }
| { type: 'SET_FILTERS'; payload: string }
| { type: 'RESET_INSPECTION' }
| { type: 'APPLY_METRIC_INSPECTION_OPTIONS' };


@@ -1,7 +1,7 @@
import { useCallback, useEffect, useMemo, useReducer, useState } from 'react';
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import { useQuery } from 'react-query';
import { inspectMetrics } from 'api/generated/services/metrics';
import { themeColors } from 'constants/theme';
import { useGetInspectMetricsDetails } from 'hooks/metricsExplorer/useGetInspectMetricsDetails';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { generateColor } from 'lib/uPlotLib/utils/generateColor';
@@ -9,6 +9,7 @@ import { INITIAL_INSPECT_METRICS_OPTIONS } from './constants';
import {
GraphPopoverData,
InspectionStep,
InspectMetricsSeries,
MetricInspectionAction,
MetricInspectionState,
UseInspectMetricsReturnData,
@@ -61,7 +62,7 @@ const metricInspectionReducer = (
...state,
currentOptions: {
...state.currentOptions,
filters: action.payload,
filterExpression: action.payload,
},
};
case 'APPLY_METRIC_INSPECTION_OPTIONS':
@@ -100,26 +101,58 @@ export function useInspectMetrics(
);
const {
data: inspectMetricsData,
data: inspectMetricsResponse,
isLoading: isInspectMetricsLoading,
isError: isInspectMetricsError,
isRefetching: isInspectMetricsRefetching,
} = useGetInspectMetricsDetails(
{
metricName: metricName ?? '',
} = useQuery({
queryKey: [
'inspectMetrics',
metricName,
start,
end,
filters: metricInspectionOptions.appliedOptions.filters,
},
{
enabled: !!metricName,
keepPreviousData: true,
},
metricInspectionOptions.appliedOptions.filterExpression,
],
queryFn: ({ signal }) =>
inspectMetrics(
{
metricName: metricName ?? '',
start,
end,
filter: metricInspectionOptions.appliedOptions.filterExpression
? { expression: metricInspectionOptions.appliedOptions.filterExpression }
: undefined,
},
signal,
),
enabled: !!metricName,
keepPreviousData: true,
});
const inspectMetricsData = useMemo(
() => ({
series: (inspectMetricsResponse?.data?.series ?? []).map((s) => {
const labels: Record<string, string> = {};
for (const l of s.labels ?? []) {
if (l.key?.name) {
labels[l.key.name] = String(l.value ?? '');
}
}
return {
labels,
values: (s.values ?? []).map((v) => ({
timestamp: v.timestamp ?? 0,
value: String(v.value ?? 0),
})),
};
}) as InspectMetricsSeries[],
}),
[inspectMetricsResponse],
);
const isDarkMode = useIsDarkMode();
const inspectMetricsTimeSeries = useMemo(() => {
const series = inspectMetricsData?.payload?.data?.series ?? [];
const series = inspectMetricsData?.series ?? [];
return series.map((series, index) => {
const title = `TS${index + 1}`;
@@ -136,11 +169,6 @@ export function useInspectMetrics(
});
}, [inspectMetricsData, isDarkMode]);
const inspectMetricsStatusCode = useMemo(
() => inspectMetricsData?.statusCode || 200,
[inspectMetricsData],
);
// Evaluate inspection step
const currentInspectionStep = useMemo(() => {
if (metricInspectionOptions.currentOptions.spaceAggregationOption) {
@@ -231,7 +259,7 @@ export function useInspectMetrics(
const spaceAggregationLabels = useMemo(() => {
const labels = new Set<string>();
inspectMetricsData?.payload?.data.series.forEach((series) => {
inspectMetricsData?.series?.forEach((series) => {
Object.keys(series.labels).forEach((label) => {
labels.add(label);
});
@@ -250,7 +278,6 @@ export function useInspectMetrics(
return {
inspectMetricsTimeSeries,
inspectMetricsStatusCode,
isInspectMetricsLoading,
isInspectMetricsError,
formattedInspectMetricsTimeSeries,


@@ -1,7 +1,7 @@
import { MetrictypesTypeDTO } from 'api/generated/services/sigNoz.schemas';
import { InspectMetricsSeries } from 'api/metricsExplorer/getInspectMetricsDetails';
import { TagFilter } from 'types/api/queryBuilder/queryBuilderData';
import { InspectMetricsSeries } from './types';
import {
GraphPopoverData,
GraphPopoverOptions,


@@ -5,7 +5,6 @@ import {
MetrictypesTemporalityDTO,
MetrictypesTypeDTO,
} from 'api/generated/services/sigNoz.schemas';
import { Temporality } from 'api/metricsExplorer/getMetricDetails';
import {
UniversalYAxisUnit,
YAxisUnitSelectorProps,
@@ -180,7 +179,10 @@ describe('Metadata', () => {
const temporalitySelect = screen.getByTestId('temporality-select');
expect(temporalitySelect).toBeInTheDocument();
await userEvent.selectOptions(temporalitySelect, Temporality.CUMULATIVE);
await userEvent.selectOptions(
temporalitySelect,
MetrictypesTemporalityDTO.cumulative,
);
const unitSelect = screen.getByTestId('unit-select');
expect(unitSelect).toBeInTheDocument();


@@ -9,16 +9,17 @@ import {
Popover,
Spin,
} from 'antd';
import {
getListMetricsQueryKey,
useListMetrics,
} from 'api/generated/services/metrics';
import { Filter } from 'api/v5/v5';
import {
convertExpressionToFilters,
convertFiltersToExpression,
} from 'components/QueryBuilderV2/utils';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { useGetMetricsListFilterValues } from 'hooks/metricsExplorer/useGetMetricsListFilterValues';
import useDebouncedFn from 'hooks/useDebouncedFunction';
import { Search } from 'lucide-react';
import { DataTypes } from 'types/api/queryBuilder/queryAutocompleteResponse';
function MetricNameSearch({
queryFilterExpression,
@@ -44,25 +45,27 @@ function MetricNameSearch({
}, [isPopoverOpen]);
const {
data: metricNameFilterValuesData,
data: metricNameListData,
isLoading: isLoadingMetricNameFilterValues,
isError: isErrorMetricNameFilterValues,
} = useGetMetricsListFilterValues(
} = useListMetrics(
{
searchText: debouncedSearchString,
filterKey: 'metric_name',
filterAttributeKeyDataType: DataTypes.String,
limit: 10,
},
{
enabled: isPopoverOpen,
refetchOnWindowFocus: false,
queryKey: [
REACT_QUERY_KEY.GET_METRICS_LIST_FILTER_VALUES,
'metric_name',
debouncedSearchString,
isPopoverOpen,
],
query: {
enabled: isPopoverOpen,
refetchOnWindowFocus: false,
queryKey: [
...getListMetricsQueryKey({
searchText: debouncedSearchString,
limit: 10,
}),
'search',
isPopoverOpen,
],
},
},
);
@@ -95,8 +98,8 @@ function MetricNameSearch({
);
const metricNameFilterValues = useMemo(
() => metricNameFilterValuesData?.payload?.data?.filterValues || [],
[metricNameFilterValuesData],
() => metricNameListData?.data?.metrics?.map((m) => m.metricName) || [],
[metricNameListData],
);
const handleKeyDown = useCallback(


@@ -2,8 +2,8 @@
import { Provider } from 'react-redux';
import { MemoryRouter } from 'react-router-dom';
import { fireEvent, render, screen } from '@testing-library/react';
import * as metricsGeneratedAPI from 'api/generated/services/metrics';
import { Filter } from 'api/v5/v5';
import * as useGetMetricsListFilterValues from 'hooks/metricsExplorer/useGetMetricsListFilterValues';
import * as useQueryBuilderOperationsHooks from 'hooks/queryBuilder/useQueryBuilderOperations';
import store from 'store';
import APIError from 'types/api/error';
@@ -53,21 +53,33 @@ describe('MetricsTable', () => {
} as any);
});
jest
.spyOn(useGetMetricsListFilterValues, 'useGetMetricsListFilterValues')
.mockReturnValue({
jest.spyOn(metricsGeneratedAPI, 'useListMetrics').mockReturnValue(({
data: {
data: {
statusCode: 200,
payload: {
status: 'success',
data: {
filterValues: ['metric1', 'metric2'],
metrics: [
{
metricName: 'metric1',
description: '',
type: '',
unit: '',
temporality: '',
isMonotonic: false,
},
},
{
metricName: 'metric2',
description: '',
type: '',
unit: '',
temporality: '',
isMonotonic: false,
},
],
},
isLoading: false,
isError: false,
} as any);
status: 'success',
},
isLoading: false,
isError: false,
} as unknown) as ReturnType<typeof metricsGeneratedAPI.useListMetrics>);
it('renders table with data correctly', () => {
render(


@@ -6,7 +6,6 @@ import {
MetricsexplorertypesTreemapEntryDTO,
MetricsexplorertypesTreemapModeDTO,
} from 'api/generated/services/sigNoz.schemas';
import { MetricsListPayload } from 'api/metricsExplorer/getMetricsList';
import { Filter } from 'api/v5/v5';
import { getUniversalNameFromMetricUnit } from 'components/YAxisUnitSelector/utils';
@@ -76,14 +75,6 @@ export const getMetricsTableColumns = (
},
];
export const getMetricsListQuery = (): MetricsListPayload => ({
filters: {
items: [],
op: 'and',
},
orderBy: { columnName: 'metric_name', order: 'asc' },
});
function ValidateRowValueWrapper({
value,
children,


@@ -6,6 +6,7 @@ import { PANEL_GROUP_TYPES, PANEL_TYPES } from 'constants/queryBuilder';
import { themeColors } from 'constants/theme';
import { Card, CardContainer } from 'container/GridCardLayout/styles';
import DateTimeSelectionV2 from 'container/TopNav/DateTimeSelectionV2';
import { DEFAULT_TIME_RANGE } from 'container/TopNav/DateTimeSelectionV2/constants';
import {
CustomTimeType,
Time,
@@ -80,7 +81,7 @@ function PublicDashboardContainer({
const { widgets } = dashboard?.data || {};
const [selectedTimeRangeLabel, setSelectedTimeRangeLabel] = useState<string>(
publicDashboard?.defaultTimeRange || '30m',
publicDashboard?.defaultTimeRange || DEFAULT_TIME_RANGE,
);
const [selectedTimeRange, setSelectedTimeRange] = useState<{
@@ -88,7 +89,7 @@ function PublicDashboardContainer({
endTime: number;
}>(
getStartTimeAndEndTimeFromTimeRange(
publicDashboard?.defaultTimeRange || '30m',
publicDashboard?.defaultTimeRange || DEFAULT_TIME_RANGE,
),
);

View File

@@ -1,5 +1,6 @@
import { Layout } from 'react-grid-layout';
import { PANEL_GROUP_TYPES, PANEL_TYPES } from 'constants/queryBuilder';
import { DEFAULT_TIME_RANGE } from 'container/TopNav/DateTimeSelectionV2/constants';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { StatusCodes } from 'http-status-codes';
import {
@@ -153,7 +154,6 @@ afterEach(() => {
// Test constants
const MOCK_PUBLIC_DASHBOARD_ID = 'test-dashboard-id';
const MOCK_PUBLIC_PATH = '/public/dashboard/test';
const DEFAULT_TIME_RANGE = '30m';
// Use title from mock data
const TEST_DASHBOARD_TITLE = publicDashboardResponse.data.dashboard.data.title;
// Use widget ID from mock data

View File

@@ -1,5 +1,5 @@
import { render, screen } from '@testing-library/react';
import { MetricType } from 'api/metricsExplorer/getMetricsList';
import { MetrictypesTypeDTO } from 'api/generated/services/sigNoz.schemas';
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
import { ReduceOperators } from 'types/common/queryBuilder';
@@ -47,7 +47,7 @@ describe('ReduceToFilter', () => {
spaceAggregation: 'sum',
},
],
aggregateAttribute: { key: 'test', type: MetricType.SUM },
aggregateAttribute: { key: 'test', type: MetrictypesTypeDTO.sum },
})}
onChange={mockOnChange}
/>,
@@ -61,7 +61,7 @@ describe('ReduceToFilter', () => {
<ReduceToFilter
query={baseQuery({
reduceTo: ReduceOperators.MAX,
aggregateAttribute: { key: 'test', type: MetricType.GAUGE },
aggregateAttribute: { key: 'test', type: MetrictypesTypeDTO.gauge },
})}
onChange={mockOnChange}
/>,

View File

@@ -2,6 +2,8 @@ import ROUTES from 'constants/routes';
import { CustomTimeType, Option, Time, TimeFrame } from './types';
export const DEFAULT_TIME_RANGE = '30m';
export const Options: Option[] = [
{ value: '5m', label: 'Last 5 minutes' },
{ value: '15m', label: 'Last 15 minutes' },
@@ -110,7 +112,7 @@ export const convertOldTimeToNewValidCustomTimeFormat = (
return `${match[1]}${unit}` as CustomTimeType;
}
return '30m';
return DEFAULT_TIME_RANGE;
};
export const getDefaultOption = (route: string): Time => {

View File

@@ -1,55 +0,0 @@
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import {
getInspectMetricsDetails,
InspectMetricsRequest,
InspectMetricsResponse,
} from 'api/metricsExplorer/getInspectMetricsDetails';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { ErrorResponse, SuccessResponse } from 'types/api';
type UseGetInspectMetricsDetails = (
requestData: InspectMetricsRequest,
options?: UseQueryOptions<
SuccessResponse<InspectMetricsResponse> | ErrorResponse,
Error
>,
headers?: Record<string, string>,
) => UseQueryResult<
SuccessResponse<InspectMetricsResponse> | ErrorResponse,
Error
>;
export const useGetInspectMetricsDetails: UseGetInspectMetricsDetails = (
requestData,
options,
headers,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [...options.queryKey];
}
if (options?.queryKey && typeof options.queryKey === 'string') {
return options.queryKey;
}
return [
REACT_QUERY_KEY.GET_INSPECT_METRICS_DETAILS,
requestData.metricName,
requestData.start,
requestData.end,
requestData.filters,
];
}, [options?.queryKey, requestData]);
return useQuery<
SuccessResponse<InspectMetricsResponse> | ErrorResponse,
Error
>({
queryFn: ({ signal }) =>
getInspectMetricsDetails(requestData, signal, headers),
...options,
queryKey,
});
};

View File

@@ -1,46 +0,0 @@
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import {
getMetricDetails,
MetricDetailsResponse,
} from 'api/metricsExplorer/getMetricDetails';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { ErrorResponse, SuccessResponse } from 'types/api';
type UseGetMetricDetails = (
metricName: string,
options?: UseQueryOptions<
SuccessResponse<MetricDetailsResponse> | ErrorResponse,
Error
>,
headers?: Record<string, string>,
) => UseQueryResult<
SuccessResponse<MetricDetailsResponse> | ErrorResponse,
Error
>;
export const useGetMetricDetails: UseGetMetricDetails = (
metricName,
options,
headers,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [...options.queryKey];
}
if (options?.queryKey && typeof options.queryKey === 'string') {
return options.queryKey;
}
return [REACT_QUERY_KEY.GET_METRIC_DETAILS, metricName];
}, [options?.queryKey, metricName]);
return useQuery<SuccessResponse<MetricDetailsResponse> | ErrorResponse, Error>(
{
queryFn: ({ signal }) => getMetricDetails(metricName, signal, headers),
...options,
queryKey,
},
);
};

View File

@@ -1,47 +0,0 @@
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import {
getMetricsList,
MetricsListPayload,
MetricsListResponse,
} from 'api/metricsExplorer/getMetricsList';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { ErrorResponse, SuccessResponse } from 'types/api';
type UseGetMetricsList = (
requestData: MetricsListPayload,
options?: UseQueryOptions<
SuccessResponse<MetricsListResponse> | ErrorResponse,
Error
>,
headers?: Record<string, string>,
) => UseQueryResult<
SuccessResponse<MetricsListResponse> | ErrorResponse,
Error
>;
export const useGetMetricsList: UseGetMetricsList = (
requestData,
options,
headers,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [...options.queryKey];
}
if (options?.queryKey && typeof options.queryKey === 'string') {
return options.queryKey;
}
return [REACT_QUERY_KEY.GET_METRICS_LIST, requestData];
}, [options?.queryKey, requestData]);
return useQuery<SuccessResponse<MetricsListResponse> | ErrorResponse, Error>({
queryFn: ({ signal }) => getMetricsList(requestData, signal, headers),
...options,
queryKey,
});
};

View File

@@ -1,48 +0,0 @@
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import {
getMetricsListFilterKeys,
GetMetricsListFilterKeysParams,
MetricsListFilterKeysResponse,
} from 'api/metricsExplorer/getMetricsListFilterKeys';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { ErrorResponse, SuccessResponse } from 'types/api';
type UseGetMetricsListFilterKeys = (
params: GetMetricsListFilterKeysParams,
options?: UseQueryOptions<
SuccessResponse<MetricsListFilterKeysResponse> | ErrorResponse,
Error
>,
headers?: Record<string, string>,
) => UseQueryResult<
SuccessResponse<MetricsListFilterKeysResponse> | ErrorResponse,
Error
>;
export const useGetMetricsListFilterKeys: UseGetMetricsListFilterKeys = (
params,
options,
headers,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [...options.queryKey];
}
if (options?.queryKey && typeof options.queryKey === 'string') {
return options.queryKey;
}
return [REACT_QUERY_KEY.GET_METRICS_LIST_FILTER_KEYS];
}, [options?.queryKey]);
return useQuery<
SuccessResponse<MetricsListFilterKeysResponse> | ErrorResponse,
Error
>({
queryFn: ({ signal }) => getMetricsListFilterKeys(params, signal, headers),
...options,
queryKey,
});
};

View File

@@ -1,42 +0,0 @@
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import {
getMetricsListFilterValues,
MetricsListFilterValuesPayload,
MetricsListFilterValuesResponse,
} from 'api/metricsExplorer/getMetricsListFilterValues';
import { ErrorResponse, SuccessResponse } from 'types/api';
type UseGetMetricsListFilterValues = (
payload: MetricsListFilterValuesPayload,
options?: UseQueryOptions<
SuccessResponse<MetricsListFilterValuesResponse> | ErrorResponse,
Error
>,
headers?: Record<string, string>,
) => UseQueryResult<
SuccessResponse<MetricsListFilterValuesResponse> | ErrorResponse,
Error
>;
export const useGetMetricsListFilterValues: UseGetMetricsListFilterValues = (
props,
options,
headers,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [...options.queryKey];
}
return [props];
}, [options?.queryKey, props]);
return useQuery<
SuccessResponse<MetricsListFilterValuesResponse> | ErrorResponse,
Error
>({
queryFn: ({ signal }) => getMetricsListFilterValues(props, signal, headers),
...options,
queryKey,
});
};

View File

@@ -1,48 +0,0 @@
import { useMemo } from 'react';
import { useQuery, UseQueryOptions, UseQueryResult } from 'react-query';
import {
getRelatedMetrics,
RelatedMetricsPayload,
RelatedMetricsResponse,
} from 'api/metricsExplorer/getRelatedMetrics';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { ErrorResponse, SuccessResponse } from 'types/api';
type UseGetRelatedMetrics = (
requestData: RelatedMetricsPayload,
options?: UseQueryOptions<
SuccessResponse<RelatedMetricsResponse> | ErrorResponse,
Error
>,
headers?: Record<string, string>,
) => UseQueryResult<
SuccessResponse<RelatedMetricsResponse> | ErrorResponse,
Error
>;
export const useGetRelatedMetrics: UseGetRelatedMetrics = (
requestData,
options,
headers,
) => {
const queryKey = useMemo(() => {
if (options?.queryKey && Array.isArray(options.queryKey)) {
return [...options.queryKey];
}
if (options?.queryKey && typeof options.queryKey === 'string') {
return options.queryKey;
}
return [REACT_QUERY_KEY.GET_RELATED_METRICS, requestData];
}, [options?.queryKey, requestData]);
return useQuery<
SuccessResponse<RelatedMetricsResponse> | ErrorResponse,
Error
>({
queryFn: ({ signal }) => getRelatedMetrics(requestData, signal, headers),
...options,
queryKey,
});
};

View File

@@ -1,7 +1,6 @@
/* eslint-disable sonarjs/cognitive-complexity */
import { useCallback, useEffect, useMemo, useRef, useState } from 'react';
import { useDebounce } from 'react-use';
import { getMetricsListFilterValues } from 'api/metricsExplorer/getMetricsListFilterValues';
import { getAttributesValues } from 'api/queryBuilder/getAttributesValues';
import { DATA_TYPE_VS_ATTRIBUTE_VALUES_KEY } from 'constants/queryBuilder';
import { DEBOUNCE_DELAY } from 'constants/queryBuilderFilterConfig';
@@ -14,7 +13,6 @@ import {
getTagToken,
isInNInOperator,
} from 'container/QueryBuilder/filters/QueryBuilderSearch/utils';
import { useGetMetricsListFilterKeys } from 'hooks/metricsExplorer/useGetMetricsListFilterKeys';
import useDebounceValue from 'hooks/useDebounce';
import { cloneDeep, isEqual, uniqWith, unset } from 'lodash-es';
import { IAttributeValuesResponse } from 'types/api/queryBuilder/getAttributesValues';
@@ -152,21 +150,8 @@ export const useFetchKeysAndValues = (
},
);
const {
data: metricsListFilterKeysData,
isFetching: isFetchingMetricsListFilterKeys,
status: fetchingMetricsListFilterKeysStatus,
} = useGetMetricsListFilterKeys(
{
searchText: searchKey,
},
{
enabled: isMetricsExplorer && isQueryEnabled && !shouldUseSuggestions,
queryKey: [searchKey],
},
);
function isAttributeValuesResponse(
// eslint-disable-next-line @typescript-eslint/no-explicit-any
payload: any,
): payload is IAttributeValuesResponse {
return (
@@ -180,14 +165,6 @@ export const useFetchKeysAndValues = (
);
}
function isMetricsListFilterValuesData(
payload: any,
): payload is { filterValues: string[] } {
return (
payload && 'filterValues' in payload && Array.isArray(payload.filterValues)
);
}
/**
* Fetches the options to be displayed based on the selected value
* @param value - the selected value
@@ -231,15 +208,6 @@ export const useFetchKeysAndValues = (
: tagValue?.toString() ?? '',
});
payload = response.payload;
} else if (isMetricsExplorer) {
const response = await getMetricsListFilterValues({
searchText: searchKey,
filterKey: filterAttributeKey?.key ?? tagKey,
filterAttributeKeyDataType:
filterAttributeKey?.dataType ?? DataTypes.EMPTY,
limit: 10,
});
payload = response.payload?.data;
} else {
const response = await getAttributesValues({
aggregateOperator: query.aggregateOperator || '',
@@ -256,18 +224,11 @@ export const useFetchKeysAndValues = (
payload = response.payload;
}
if (payload) {
if (isAttributeValuesResponse(payload)) {
const dataType = filterAttributeKey?.dataType ?? DataTypes.String;
const key = DATA_TYPE_VS_ATTRIBUTE_VALUES_KEY[dataType];
setResults(key ? payload[key] || [] : []);
return;
}
if (isMetricsExplorer && isMetricsListFilterValuesData(payload)) {
setResults(payload.filterValues || []);
return;
}
if (payload && isAttributeValuesResponse(payload)) {
const dataType = filterAttributeKey?.dataType ?? DataTypes.String;
const key = DATA_TYPE_VS_ATTRIBUTE_VALUES_KEY[dataType];
setResults(key ? payload[key] || [] : []);
return;
}
} catch (e) {
console.error(e);
@@ -305,32 +266,6 @@ export const useFetchKeysAndValues = (
}
}, [data?.payload?.attributeKeys, status]);
useEffect(() => {
if (
isMetricsExplorer &&
fetchingMetricsListFilterKeysStatus === 'success' &&
!isFetchingMetricsListFilterKeys &&
metricsListFilterKeysData?.payload?.data?.attributeKeys
) {
setKeys(metricsListFilterKeysData.payload.data.attributeKeys);
setSourceKeys((prevState) =>
uniqWith(
[
...(metricsListFilterKeysData.payload.data.attributeKeys ?? []),
...prevState,
],
isEqual,
),
);
}
}, [
metricsListFilterKeysData?.payload?.data?.attributeKeys,
fetchingMetricsListFilterKeysStatus,
isMetricsExplorer,
metricsListFilterKeysData,
isFetchingMetricsListFilterKeys,
]);
useEffect(() => {
if (
fetchingSuggestionsStatus === 'success' &&

View File

@@ -20,6 +20,7 @@ import AlertHistory from 'container/AlertHistory';
import { TIMELINE_TABLE_PAGE_SIZE } from 'container/AlertHistory/constants';
import { AlertDetailsTab, TimelineFilter } from 'container/AlertHistory/types';
import { urlKey } from 'container/AllError/utils';
import { DEFAULT_TIME_RANGE } from 'container/TopNav/DateTimeSelectionV2/constants';
import useAxiosError from 'hooks/useAxiosError';
import { useNotifications } from 'hooks/useNotifications';
import { useSafeNavigate } from 'hooks/useSafeNavigate';
@@ -46,8 +47,6 @@ import { PayloadProps } from 'types/api/alerts/get';
import { TagFilter } from 'types/api/queryBuilder/queryBuilderData';
import { nanoToMilli } from 'utils/timeUtils';
const DEFAULT_TIME_RANGE = '30m';
export const useAlertHistoryQueryParams = (): {
ruleId: string | null;
startTime: number;

View File

@@ -1,15 +0,0 @@
import { Temporality } from 'api/metricsExplorer/getMetricDetails';
import { MetricType } from 'api/metricsExplorer/getMetricsList';
export interface MetricMetadata {
description: string;
type: MetricType;
unit: string;
temporality: Temporality;
isMonotonic: boolean;
}
export interface MetricMetadataResponse {
status: string;
data: MetricMetadata;
}

11
go.mod
View File

@@ -36,7 +36,7 @@ require (
github.com/open-telemetry/opamp-go v0.22.0
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza v0.144.0
github.com/openfga/api/proto v0.0.0-20250909172242-b4b2a12f5c67
github.com/openfga/language/pkg/go v0.2.0-beta.2.0.20250428093642-7aeebe78bbfe
github.com/openfga/language/pkg/go v0.2.0-beta.2.0.20251027165255-0f8f255e5f6c
github.com/opentracing/opentracing-go v1.2.0
github.com/pkg/errors v0.9.1
github.com/prometheus/alertmanager v0.31.0
@@ -87,10 +87,11 @@ require (
gopkg.in/yaml.v2 v2.4.0
gopkg.in/yaml.v3 v3.0.1
k8s.io/apimachinery v0.35.0
modernc.org/sqlite v1.39.1
modernc.org/sqlite v1.40.1
)
require (
github.com/IBM/pgxpoolprometheus v1.1.2 // indirect
github.com/aws/aws-sdk-go-v2 v1.41.1 // indirect
github.com/aws/aws-sdk-go-v2/config v1.32.7 // indirect
github.com/aws/aws-sdk-go-v2/credentials v1.19.7 // indirect
@@ -214,7 +215,7 @@ require (
github.com/gopherjs/gopherjs v1.17.2 // indirect
github.com/grafana/regexp v0.0.0-20250905093917-f7b3be9d1853 // indirect
github.com/grpc-ecosystem/go-grpc-middleware v1.4.0 // indirect
github.com/grpc-ecosystem/go-grpc-middleware/v2 v2.3.2 // indirect
github.com/grpc-ecosystem/go-grpc-middleware/v2 v2.3.3 // indirect
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.7 // indirect
github.com/hashicorp/errwrap v1.1.0 // indirect
github.com/hashicorp/go-immutable-radix v1.3.1 // indirect
@@ -267,14 +268,14 @@ require (
github.com/open-telemetry/opentelemetry-collector-contrib/internal/exp/metrics v0.145.0 // indirect
github.com/open-telemetry/opentelemetry-collector-contrib/pkg/pdatautil v0.145.0 // indirect
github.com/open-telemetry/opentelemetry-collector-contrib/processor/deltatocumulativeprocessor v0.145.0 // indirect
github.com/openfga/openfga v1.10.1
github.com/openfga/openfga v1.11.2
github.com/paulmach/orb v0.11.1 // indirect
github.com/pelletier/go-toml/v2 v2.2.4 // indirect
github.com/pierrec/lz4/v4 v4.1.23 // indirect
github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c // indirect
github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 // indirect
github.com/power-devops/perfstat v0.0.0-20240221224432-82ca36839d55 // indirect
github.com/pressly/goose/v3 v3.25.0 // indirect
github.com/pressly/goose/v3 v3.26.0 // indirect
github.com/prometheus/client_model v0.6.2 // indirect
github.com/prometheus/exporter-toolkit v0.15.1 // indirect
github.com/prometheus/otlptranslator v1.0.0 // indirect

22
go.sum
View File

@@ -97,6 +97,8 @@ github.com/DATA-DOG/go-sqlmock v1.5.2 h1:OcvFkGmslmlZibjAjaHm3L//6LiuBgolP7Oputl
github.com/DATA-DOG/go-sqlmock v1.5.2/go.mod h1:88MAG/4G7SMwSE3CeA0ZKzrT5CiOU3OJ+JlNzwDqpNU=
github.com/DataDog/datadog-go v3.2.0+incompatible/go.mod h1:LButxg5PwREeZtORoXG3tL4fMGNddJ+vMq1mwgfaqoQ=
github.com/DataDog/datadog-go v3.7.1+incompatible/go.mod h1:LButxg5PwREeZtORoXG3tL4fMGNddJ+vMq1mwgfaqoQ=
github.com/IBM/pgxpoolprometheus v1.1.2 h1:sHJwxoL5Lw4R79Zt+H4Uj1zZ4iqXJLdk7XDE7TPs97U=
github.com/IBM/pgxpoolprometheus v1.1.2/go.mod h1:+vWzISN6S9ssgurhUNmm6AlXL9XLah3TdWJktquKTR8=
github.com/Masterminds/squirrel v1.5.4 h1:uUcX/aBc8O7Fg9kaISIUsHXdKuqehiXAMQTYX8afzqM=
github.com/Masterminds/squirrel v1.5.4/go.mod h1:NNaOrjSoIDfDA40n7sr2tPNZRfjzjA400rg+riTZj10=
github.com/Microsoft/go-winio v0.6.2 h1:F2VQgta7ecxGYO8k3ZZz3RS8fVIXVxONVUPlNERoyfY=
@@ -566,8 +568,8 @@ github.com/grafana/regexp v0.0.0-20250905093917-f7b3be9d1853 h1:cLN4IBkmkYZNnk7E
github.com/grafana/regexp v0.0.0-20250905093917-f7b3be9d1853/go.mod h1:+JKpmjMGhpgPL+rXZ5nsZieVzvarn86asRlBg4uNGnk=
github.com/grpc-ecosystem/go-grpc-middleware v1.4.0 h1:UH//fgunKIs4JdUbpDl1VZCDaL56wXCB/5+wF6uHfaI=
github.com/grpc-ecosystem/go-grpc-middleware v1.4.0/go.mod h1:g5qyo/la0ALbONm6Vbp88Yd8NsDy6rZz+RcrMPxvld8=
github.com/grpc-ecosystem/go-grpc-middleware/v2 v2.3.2 h1:sGm2vDRFUrQJO/Veii4h4zG2vvqG6uWNkBHSTqXOZk0=
github.com/grpc-ecosystem/go-grpc-middleware/v2 v2.3.2/go.mod h1:wd1YpapPLivG6nQgbf7ZkG1hhSOXDhhn4MLTknx2aAc=
github.com/grpc-ecosystem/go-grpc-middleware/v2 v2.3.3 h1:B+8ClL/kCQkRiU82d9xajRPKYMrB7E0MbtzWVi1K4ns=
github.com/grpc-ecosystem/go-grpc-middleware/v2 v2.3.3/go.mod h1:NbCUVmiS4foBGBHOYlCT25+YmGpJ32dZPi75pGEUpj4=
github.com/grpc-ecosystem/go-grpc-prometheus v1.2.0/go.mod h1:8NvIoxWQoOIhqOTXgfV/d3M/q6VIi02HzZEHgUlZvzk=
github.com/grpc-ecosystem/grpc-gateway v1.16.0/go.mod h1:BDjrQk3hbvj6Nolgz8mAMFbcEtjT1g+wF4CSlocrBnw=
github.com/grpc-ecosystem/grpc-gateway/v2 v2.27.7 h1:X+2YciYSxvMQK0UZ7sg45ZVabVZBeBuvMkmuI2V3Fak=
@@ -867,10 +869,10 @@ github.com/opencontainers/image-spec v1.1.1 h1:y0fUlFfIZhPF1W537XOLg0/fcx6zcHCJw
github.com/opencontainers/image-spec v1.1.1/go.mod h1:qpqAh3Dmcf36wStyyWU+kCeDgrGnAve2nCC8+7h8Q0M=
github.com/openfga/api/proto v0.0.0-20250909172242-b4b2a12f5c67 h1:58mhO5nqkdka2Mpg5mijuZOHScX7reowhzRciwjFCU8=
github.com/openfga/api/proto v0.0.0-20250909172242-b4b2a12f5c67/go.mod h1:XDX4qYNBUM2Rsa2AbKPh+oocZc2zgme+EF2fFC6amVU=
github.com/openfga/language/pkg/go v0.2.0-beta.2.0.20250428093642-7aeebe78bbfe h1:X1g0rBUMvvzMudsak/jmoEZ1NhSsp6yR0VGxWHnGMzs=
github.com/openfga/language/pkg/go v0.2.0-beta.2.0.20250428093642-7aeebe78bbfe/go.mod h1:5Z0pbTT7Jz/oQFLfadb+C5t5NwHrduAO7j7L07Ec1GM=
github.com/openfga/openfga v1.10.1 h1:iznHh7fgmJO+XhWPOJbPUwmE2r1ruoCRgjpPiB2D164=
github.com/openfga/openfga v1.10.1/go.mod h1:LAcl94t0m+2w2cP9VWmQkwAnn0jF9tsf4Oio0n/iaAE=
github.com/openfga/language/pkg/go v0.2.0-beta.2.0.20251027165255-0f8f255e5f6c h1:xPbHNFG8QbPr/fpL7u0MPI0x74/BCLm7Sx02btL1m5Q=
github.com/openfga/language/pkg/go v0.2.0-beta.2.0.20251027165255-0f8f255e5f6c/go.mod h1:BG26d1Fk4GSg0wMj60TRJ6Pe4ka2WQ33akhO+mzt3t0=
github.com/openfga/openfga v1.11.2 h1:6vFZSSE0pyyt9qz320BgQLh/sHxZY5nfPOcJ3d5g8Bg=
github.com/openfga/openfga v1.11.2/go.mod h1:aCDb0gaWsU6dDAdC+zNOR2XC2W3lteGwKSkRWcSjGW8=
github.com/opentracing/opentracing-go v1.1.0/go.mod h1:UkNAQd3GIcIGf0SeVgPpRdFStlNbqXla1AfSYxPUl2o=
github.com/opentracing/opentracing-go v1.2.0 h1:uEJPy/1a5RIPAJ0Ov+OIO8OxWu77jEv+1B0VhjKrZUs=
github.com/opentracing/opentracing-go v1.2.0/go.mod h1:GxEUsuufX4nBwe+T+Wl9TAgYrxe9dPLANfrWvHYVTgc=
@@ -909,8 +911,8 @@ github.com/posener/complete v1.1.1/go.mod h1:em0nMJCgc9GFtwrmVmEMR/ZL6WyhyjMBndr
github.com/posener/complete v1.2.3/go.mod h1:WZIdtGGp+qx0sLrYKtIRAruyNpv6hFCicSgv7Sy7s/s=
github.com/power-devops/perfstat v0.0.0-20240221224432-82ca36839d55 h1:o4JXh1EVt9k/+g42oCprj/FisM4qX9L3sZB3upGN2ZU=
github.com/power-devops/perfstat v0.0.0-20240221224432-82ca36839d55/go.mod h1:OmDBASR4679mdNQnz2pUhc2G8CO2JrUAVFDRBDP/hJE=
github.com/pressly/goose/v3 v3.25.0 h1:6WeYhMWGRCzpyd89SpODFnCBCKz41KrVbRT58nVjGng=
github.com/pressly/goose/v3 v3.25.0/go.mod h1:4hC1KrritdCxtuFsqgs1R4AU5bWtTAf+cnWvfhf2DNY=
github.com/pressly/goose/v3 v3.26.0 h1:KJakav68jdH0WDvoAcj8+n61WqOIaPGgH0bJWS6jpmM=
github.com/pressly/goose/v3 v3.26.0/go.mod h1:4hC1KrritdCxtuFsqgs1R4AU5bWtTAf+cnWvfhf2DNY=
github.com/prometheus/alertmanager v0.31.0 h1:DQW02uIUNNiAa9AD9VA5xaFw5D+xrV+bocJc4gN9bEU=
github.com/prometheus/alertmanager v0.31.0/go.mod h1:zWPQwhbLt2ybee8rL921UONeQ59Oncash+m/hGP17tU=
github.com/prometheus/client_golang v0.9.1/go.mod h1:7SWBe2y4D6OKWSNQJUaRYU/AaXPKyh/dDVn+NZz0KFw=
@@ -1945,8 +1947,8 @@ modernc.org/opt v0.1.4 h1:2kNGMRiUjrp4LcaPuLY2PzUfqM/w9N23quVwhKt5Qm8=
modernc.org/opt v0.1.4/go.mod h1:03fq9lsNfvkYSfxrfUhZCWPk1lm4cq4N+Bh//bEtgns=
modernc.org/sortutil v1.2.1 h1:+xyoGf15mM3NMlPDnFqrteY07klSFxLElE2PVuWIJ7w=
modernc.org/sortutil v1.2.1/go.mod h1:7ZI3a3REbai7gzCLcotuw9AC4VZVpYMjDzETGsSMqJE=
modernc.org/sqlite v1.39.1 h1:H+/wGFzuSCIEVCvXYVHX5RQglwhMOvtHSv+VtidL2r4=
modernc.org/sqlite v1.39.1/go.mod h1:9fjQZ0mB1LLP0GYrp39oOJXx/I2sxEnZtzCmEQIKvGE=
modernc.org/sqlite v1.40.1 h1:VfuXcxcUWWKRBuP8+BR9L7VnmusMgBNNnBYGEe9w/iY=
modernc.org/sqlite v1.40.1/go.mod h1:9fjQZ0mB1LLP0GYrp39oOJXx/I2sxEnZtzCmEQIKvGE=
modernc.org/strutil v1.2.1 h1:UneZBkQA+DX2Rp35KcM69cSsNES9ly8mQWD71HKlOA0=
modernc.org/strutil v1.2.1/go.mod h1:EHkiggD70koQxjVdSBM3JKM7k6L0FbGE5eymy9i3B9A=
modernc.org/token v1.1.0 h1:Xl7Ap9dKaEs5kLoOQeQmPWevfnk/DM5qcLcYlA8ys6Y=

View File

@@ -47,7 +47,7 @@ type Dispatcher struct {
receiverRoutes map[string]*dispatch.Route
}
// We use the upstream Limits interface from Prometheus
// We use the upstream Limits interface from Prometheus.
type Limits = dispatch.Limits
// NewDispatcher returns a new Dispatcher.
@@ -273,7 +273,7 @@ type notifyFunc func(context.Context, ...*types.Alert) bool
// processAlert determines in which aggregation group the alert falls
// and inserts it.
// no data alert will only have ruleId and no data label
// no data alert will only have ruleId and no data label.
func (d *Dispatcher) processAlert(alert *types.Alert, route *dispatch.Route) {
ruleId := getRuleIDFromAlert(alert)
config, err := d.notificationManager.GetNotificationConfig(d.orgID, ruleId)
@@ -510,7 +510,7 @@ func (ag *aggrGroup) flush(notify func(...*types.Alert) bool) {
}
}
// unlimitedLimits provides unlimited aggregation groups for SigNoz
// unlimitedLimits provides unlimited aggregation groups for SigNoz.
type unlimitedLimits struct{}
func (u *unlimitedLimits) MaxNumberOfAggregationGroups() int { return 0 }

View File

@@ -169,19 +169,20 @@ func TestEndToEndAlertManagerFlow(t *testing.T) {
require.Len(t, alerts, 3, "Expected 3 active alerts")
for _, alert := range alerts {
require.Equal(t, "high-cpu-usage", alert.Alert.Labels["ruleId"])
require.NotEmpty(t, alert.Alert.Labels["severity"])
require.Contains(t, []string{"critical", "warning"}, alert.Alert.Labels["severity"])
require.Equal(t, "prod-cluster", alert.Alert.Labels["cluster"])
require.NotEmpty(t, alert.Alert.Labels["instance"])
require.Equal(t, "high-cpu-usage", alert.Labels["ruleId"])
require.NotEmpty(t, alert.Labels["severity"])
require.Contains(t, []string{"critical", "warning"}, alert.Labels["severity"])
require.Equal(t, "prod-cluster", alert.Labels["cluster"])
require.NotEmpty(t, alert.Labels["instance"])
}
criticalAlerts := 0
warningAlerts := 0
for _, alert := range alerts {
if alert.Alert.Labels["severity"] == "critical" {
switch alert.Labels["severity"] {
case "critical":
criticalAlerts++
} else if alert.Alert.Labels["severity"] == "warning" {
case "warning":
warningAlerts++
}
}
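The if/else-to-switch refactor in the test hunk above can be reduced to a small standalone sketch (the `tallySeverities` helper is illustrative, not part of the test; the "critical"/"warning" strings are the same ones asserted above):

```go
package main

import "fmt"

// tallySeverities counts critical and warning alerts with a switch,
// mirroring the refactored counting loop in the end-to-end test.
func tallySeverities(severities []string) (critical, warning int) {
	for _, s := range severities {
		switch s {
		case "critical":
			critical++
		case "warning":
			warning++
		}
	}
	return critical, warning
}

func main() {
	c, w := tallySeverities([]string{"critical", "warning", "critical"})
	fmt.Println(c, w) // 2 1
}
```

A switch over the label value also satisfies staticcheck's preference for tagged switches over if/else chains on the same expression, which is what the linting commit in this PR enables.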

View File

@@ -124,7 +124,7 @@ func TestServerPutAlerts(t *testing.T) {
require.NoError(t, err)
assert.Equal(t, 1, len(gettableAlerts))
assert.Equal(t, gettableAlerts[0].Alert.Labels["alertname"], "test-alert")
assert.Equal(t, gettableAlerts[0].Labels["alertname"], "test-alert")
assert.NoError(t, server.Stop(context.Background()))
}

View File

@@ -9,7 +9,7 @@ type DispatcherMetrics struct {
}
// NewDispatcherMetrics returns a new registered DispatchMetrics.
// todo(aniketio-ctrl): change prom metrics to otel metrics
// todo(aniketio-ctrl): change prom metrics to otel metrics.
func NewDispatcherMetrics(registerLimitMetrics bool, r prometheus.Registerer) *DispatcherMetrics {
m := DispatcherMetrics{
aggrGroups: prometheus.NewGauge(

View File

@@ -9,7 +9,7 @@ import (
"github.com/prometheus/common/model"
)
// MockNotificationManager is a simple mock implementation of NotificationManager
// MockNotificationManager is a simple mock implementation of NotificationManager.
type MockNotificationManager struct {
configs map[string]*alertmanagertypes.NotificationConfig
routes map[string]*alertmanagertypes.RoutePolicy
@@ -17,7 +17,7 @@ type MockNotificationManager struct {
errors map[string]error
}
// NewMock creates a new mock notification manager
// NewMock creates a new mock notification manager.
func NewMock() *MockNotificationManager {
return &MockNotificationManager{
configs: make(map[string]*alertmanagertypes.NotificationConfig),

View File

@@ -216,7 +216,7 @@ func (r *provider) Match(ctx context.Context, orgID string, ruleID string, set m
// the nested structure takes precedence. That means we will replace an existing leaf at any
// intermediate path with a map so we can materialize the deeper structure.
// TODO(srikanthccv): we need a better solution to handle this, remove the following
// when we update the expr to support dotted keys
// when we update the expr to support dotted keys.
func (r *provider) convertLabelSetToEnv(ctx context.Context, labelSet model.LabelSet) map[string]interface{} {
env := make(map[string]interface{})

View File

@@ -217,7 +217,7 @@ func (provider *provider) GetConfig(ctx context.Context, orgID string) (*alertma
}
func (provider *provider) SetDefaultConfig(ctx context.Context, orgID string) error {
config, err := alertmanagertypes.NewDefaultConfig(provider.config.Signoz.Config.Global, provider.config.Signoz.Config.Route, orgID)
config, err := alertmanagertypes.NewDefaultConfig(provider.config.Signoz.Global, provider.config.Signoz.Route, orgID)
if err != nil {
return err
}

View File

@@ -183,5 +183,43 @@ func (provider *provider) addMetricsExplorerRoutes(router *mux.Router) error {
return err
}
if err := router.Handle("/api/v2/metrics/inspect", handler.New(
provider.authZ.ViewAccess(provider.metricsExplorerHandler.InspectMetrics),
handler.OpenAPIDef{
ID: "InspectMetrics",
Tags: []string{"metrics"},
Summary: "Inspect raw metric data points",
Description: "Returns raw time series data points for a metric within a time range (max 30 minutes). Each series includes labels and timestamp/value pairs.",
Request: new(metricsexplorertypes.InspectMetricsRequest),
RequestContentType: "application/json",
Response: new(metricsexplorertypes.InspectMetricsResponse),
ResponseContentType: "application/json",
SuccessStatusCode: http.StatusOK,
ErrorStatusCodes: []int{http.StatusBadRequest, http.StatusUnauthorized, http.StatusInternalServerError},
Deprecated: false,
SecuritySchemes: newSecuritySchemes(types.RoleViewer),
})).Methods(http.MethodPost).GetError(); err != nil {
return err
}
if err := router.Handle("/api/v2/metrics/onboarding", handler.New(
provider.authZ.ViewAccess(provider.metricsExplorerHandler.GetOnboardingStatus),
handler.OpenAPIDef{
ID: "GetMetricsOnboardingStatus",
Tags: []string{"metrics"},
Summary: "Check if non-SigNoz metrics have been received",
Description: "Lightweight endpoint that checks if any non-SigNoz metrics have been ingested, used for onboarding status detection",
Request: nil,
RequestContentType: "",
Response: new(metricsexplorertypes.MetricsOnboardingResponse),
ResponseContentType: "application/json",
SuccessStatusCode: http.StatusOK,
ErrorStatusCodes: []int{http.StatusUnauthorized, http.StatusInternalServerError},
Deprecated: false,
SecuritySchemes: newSecuritySchemes(types.RoleViewer),
})).Methods(http.MethodGet).GetError(); err != nil {
return err
}
return nil
}

15
pkg/auditor/auditor.go Normal file
View File

@@ -0,0 +1,15 @@
package auditor
import (
"context"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/types/audittypes"
)
type Auditor interface {
factory.ServiceWithHealthy
// Audit emits an audit event. It is fire-and-forget: callers never block on audit outcomes.
Audit(ctx context.Context, event audittypes.AuditEvent)
}
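The fire-and-forget contract in the interface comment (callers never block on audit outcomes) can be sketched with a buffered channel and a non-blocking send; the `emit` helper below is illustrative only, not part of the `auditor` package:

```go
package main

import "fmt"

// emit attempts a non-blocking send of an audit event name into a
// bounded buffer. When the buffer is full the event is dropped
// (fail-open), so the caller never blocks on audit outcomes.
func emit(buf chan string, event string) bool {
	select {
	case buf <- event:
		return true
	default:
		return false // buffer full: drop the event
	}
}

func main() {
	buf := make(chan string, 2)
	fmt.Println(emit(buf, "user.login"))  // true
	fmt.Println(emit(buf, "user.logout")) // true
	fmt.Println(emit(buf, "overflow"))    // false: dropped
}
```

The `auditorserver.Server` added later in this PR implements the same fail-open semantics with a mutex-guarded slice instead of a channel, so it can also report the drop via metrics and a warning log.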

View File

@@ -0,0 +1,17 @@
package auditorserver
import "time"
type Config struct {
// BufferSize is the maximum number of events that can be buffered.
// When full, new events are dropped (fail-open).
BufferSize int
// BatchSize is the maximum number of events per export batch.
BatchSize int
// FlushInterval is the maximum time between flushes.
// A flush is triggered when either BatchSize events accumulate or
// FlushInterval elapses, whichever comes first.
FlushInterval time.Duration
}
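The comments on Config describe two flush triggers: BatchSize events accumulating, or FlushInterval elapsing, whichever comes first. A minimal sketch of that decision (the `shouldFlush` function name is illustrative; the real server makes this choice inside its select loop):

```go
package main

import "fmt"

// shouldFlush reports whether a batch export should run now:
// either enough events have accumulated to fill a batch, or the
// flush-interval timer fired with events still buffered.
func shouldFlush(queued, batchSize int, timerFired bool) bool {
	return queued >= batchSize || (timerFired && queued > 0)
}

func main() {
	fmt.Println(shouldFlush(10, 10, false)) // true: batch is full
	fmt.Println(shouldFlush(3, 10, true))   // true: interval elapsed
	fmt.Println(shouldFlush(3, 10, false))  // false: keep buffering
}
```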

View File

@@ -0,0 +1,211 @@
package auditorserver
import (
"context"
"log/slog"
"sync"
"time"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/types/audittypes"
"go.opentelemetry.io/otel/attribute"
"go.opentelemetry.io/otel/codes"
"go.opentelemetry.io/otel/metric"
"go.opentelemetry.io/otel/trace"
)
var _ factory.ServiceWithHealthy = (*Server)(nil)
// ExportFunc is called by the server to export a batch of audit events.
// The context carries the active trace span for the export operation.
type ExportFunc func(ctx context.Context, events []audittypes.AuditEvent) error
// Server buffers audit events and flushes them in batches.
// A flush is triggered when either BatchSize events accumulate or
// FlushInterval elapses, whichever comes first.
type Server struct {
// settings provides logger, meter, and tracer for instrumentation.
settings factory.ScopedProviderSettings
// config holds buffer size, batch size, and flush interval.
config Config
// exportFn is called with each batch of events ready for export.
exportFn ExportFunc
// queue holds buffered events waiting to be batched.
queue []audittypes.AuditEvent
// queueMtx guards access to queue.
queueMtx sync.Mutex
// moreC signals the flush goroutine that new events are available.
moreC chan struct{}
// healthyC is closed once Start has registered the flush goroutine.
// Also serves as the Healthy() signal for factory.ServiceWithHealthy.
healthyC chan struct{}
// stopC signals the flush goroutine to drain and shut down.
stopC chan struct{}
// goroutinesWg tracks the background flush goroutine.
goroutinesWg sync.WaitGroup
// metrics holds OTel counters and gauges for observability.
metrics *serverMetrics
}
func New(settings factory.ScopedProviderSettings, config Config, exportFn ExportFunc) (*Server, error) {
metrics, err := newServerMetrics(settings.Meter())
if err != nil {
return nil, err
}
server := &Server{
settings: settings,
config: config,
metrics: metrics,
exportFn: exportFn,
queue: make([]audittypes.AuditEvent, 0, config.BufferSize),
moreC: make(chan struct{}, 1),
healthyC: make(chan struct{}),
stopC: make(chan struct{}),
}
_, err = settings.Meter().RegisterCallback(func(_ context.Context, o metric.Observer) error {
o.ObserveInt64(server.metrics.bufferSize, int64(server.queueLen()))
return nil
}, server.metrics.bufferSize)
if err != nil {
return nil, err
}
return server, nil
}
// Start runs the background flush loop. It blocks until Stop is called.
func (server *Server) Start(ctx context.Context) error {
server.goroutinesWg.Add(1)
close(server.healthyC)
go func() {
defer server.goroutinesWg.Done()
ticker := time.NewTicker(server.config.FlushInterval)
defer ticker.Stop()
for {
select {
case <-server.stopC:
server.drain(ctx)
return
case <-server.moreC:
if server.queueLen() >= server.config.BatchSize {
server.flush(ctx)
}
case <-ticker.C:
server.flush(ctx)
}
}
}()
server.goroutinesWg.Wait()
return nil
}
// Add enqueues an audit event for batched export.
// If the buffer is full, the event is dropped and a warning is logged.
func (server *Server) Add(ctx context.Context, event audittypes.AuditEvent) {
ctx, span := server.settings.Tracer().Start(ctx, "auditorserver.Add", trace.WithAttributes(attribute.String("audit.event_name", event.EventName.String())))
defer span.End()
server.queueMtx.Lock()
defer server.queueMtx.Unlock()
if len(server.queue) >= server.config.BufferSize {
server.metrics.eventsDropped.Add(ctx, 1)
span.SetAttributes(attribute.Bool("audit.dropped", true))
server.settings.Logger().WarnContext(ctx, "audit event dropped, buffer full", slog.Int("audit_buffer_size", server.config.BufferSize))
return
}
server.queue = append(server.queue, event)
server.setMore()
}
// Healthy returns a channel that is closed once the server is ready to accept events.
func (server *Server) Healthy() <-chan struct{} {
return server.healthyC
}
// Stop signals the background loop to drain remaining events and shut down.
// It waits for the server to have started, then blocks until all buffered events have been exported.
func (server *Server) Stop(ctx context.Context) error {
<-server.healthyC
close(server.stopC)
server.goroutinesWg.Wait()
return nil
}
func (server *Server) queueLen() int {
server.queueMtx.Lock()
defer server.queueMtx.Unlock()
return len(server.queue)
}
func (server *Server) export(ctx context.Context, events []audittypes.AuditEvent) {
ctx, span := server.settings.Tracer().Start(ctx, "auditorserver.Export", trace.WithAttributes(attribute.Int("audit.batch_size", len(events))))
defer span.End()
server.metrics.eventsEmitted.Add(ctx, int64(len(events)))
if err := server.exportFn(ctx, events); err != nil {
server.metrics.writeErrors.Add(ctx, 1)
span.RecordError(err)
span.SetStatus(codes.Error, err.Error())
server.settings.Logger().ErrorContext(ctx, "audit batch export failed", errors.Attr(err), slog.Int("audit_batch_size", len(events)))
}
}
func (server *Server) flush(ctx context.Context) {
events := server.next()
if len(events) == 0 {
return
}
server.export(ctx, events)
}
func (server *Server) drain(ctx context.Context) {
for server.queueLen() > 0 {
events := server.next()
if len(events) == 0 {
return
}
server.export(ctx, events)
}
}
func (server *Server) next() []audittypes.AuditEvent {
server.queueMtx.Lock()
defer server.queueMtx.Unlock()
if len(server.queue) == 0 {
return nil
}
n := min(server.config.BatchSize, len(server.queue))
batch := make([]audittypes.AuditEvent, n)
copy(batch, server.queue[:n])
server.queue = server.queue[n:]
return batch
}
func (server *Server) setMore() {
select {
case server.moreC <- struct{}{}:
default:
}
}
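The `next` and `setMore` helpers above capture two small, reusable patterns: draining at most `BatchSize` events out of a shared queue, and sending a coalescing wakeup that never blocks the producer. A minimal stdlib-only sketch (the names `nextBatch` and `signal` are illustrative, not part of the package):

```go
package main

import "fmt"

// nextBatch copies out at most batchSize items and returns the remainder,
// mirroring Server.next above (copy, then reslice the queue).
func nextBatch(queue []int, batchSize int) (batch, rest []int) {
	if len(queue) == 0 {
		return nil, queue
	}
	n := min(batchSize, len(queue))
	batch = make([]int, n)
	copy(batch, queue[:n])
	return batch, queue[n:]
}

// signal performs a coalescing non-blocking send, mirroring Server.setMore:
// if a wakeup is already pending on the capacity-1 channel, the new one is dropped.
func signal(c chan struct{}) bool {
	select {
	case c <- struct{}{}:
		return true
	default:
		return false
	}
}

func main() {
	batch, rest := nextBatch([]int{1, 2, 3, 4, 5}, 3)
	fmt.Println(len(batch), len(rest)) // 3 2

	c := make(chan struct{}, 1)
	fmt.Println(signal(c), signal(c)) // true false
}
```

The capacity-1 channel is what makes `Add` cheap: at most one pending wakeup exists, so a burst of events collapses into a single flush check.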


@@ -0,0 +1,219 @@
package auditorserver
import (
"context"
"sync"
"sync/atomic"
"testing"
"time"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/instrumentation/instrumentationtest"
"github.com/SigNoz/signoz/pkg/types/audittypes"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)
func newTestSettings() factory.ScopedProviderSettings {
return factory.NewScopedProviderSettings(instrumentationtest.New().ToProviderSettings(), "auditorserver_test")
}
func newTestEvent(resource string, action audittypes.Action) audittypes.AuditEvent {
return audittypes.AuditEvent{
Timestamp: time.Now(),
EventName: audittypes.NewEventName(resource, action),
ResourceName: resource,
Action: action,
Outcome: audittypes.OutcomeSuccess,
}
}
func TestNew(t *testing.T) {
settings := newTestSettings()
config := Config{BufferSize: 10, BatchSize: 5, FlushInterval: time.Second}
server, err := New(settings, config, func(_ context.Context, _ []audittypes.AuditEvent) error { return nil })
require.NoError(t, err)
assert.NotNil(t, server)
}
func TestStart_Stop(t *testing.T) {
settings := newTestSettings()
config := Config{BufferSize: 10, BatchSize: 5, FlushInterval: time.Second}
server, err := New(settings, config, func(_ context.Context, _ []audittypes.AuditEvent) error { return nil })
require.NoError(t, err)
done := make(chan error, 1)
go func() { done <- server.Start(context.Background()) }()
require.NoError(t, server.Stop(context.Background()))
select {
case err := <-done:
assert.NoError(t, err)
case <-time.After(2 * time.Second):
assert.Fail(t, "Start did not return after Stop")
}
}
func TestAdd_FlushesOnBatchSize(t *testing.T) {
var exported []audittypes.AuditEvent
var mu sync.Mutex
settings := newTestSettings()
config := Config{BufferSize: 100, BatchSize: 3, FlushInterval: time.Hour}
server, err := New(settings, config, func(_ context.Context, events []audittypes.AuditEvent) error {
mu.Lock()
exported = append(exported, events...)
mu.Unlock()
return nil
})
require.NoError(t, err)
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
go func() { _ = server.Start(ctx) }()
for i := 0; i < 3; i++ {
server.Add(ctx, newTestEvent("dashboard", audittypes.ActionCreate))
}
assert.Eventually(t, func() bool {
mu.Lock()
defer mu.Unlock()
return len(exported) == 3
}, 2*time.Second, 10*time.Millisecond)
require.NoError(t, server.Stop(ctx))
}
func TestAdd_FlushesOnInterval(t *testing.T) {
var exported atomic.Int64
settings := newTestSettings()
config := Config{BufferSize: 100, BatchSize: 1000, FlushInterval: 50 * time.Millisecond}
server, err := New(settings, config, func(_ context.Context, events []audittypes.AuditEvent) error {
exported.Add(int64(len(events)))
return nil
})
require.NoError(t, err)
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
go func() { _ = server.Start(ctx) }()
server.Add(ctx, newTestEvent("user", audittypes.ActionUpdate))
assert.Eventually(t, func() bool {
return exported.Load() == 1
}, 2*time.Second, 10*time.Millisecond)
require.NoError(t, server.Stop(ctx))
}
func TestAdd_DropsWhenBufferFull(t *testing.T) {
settings := newTestSettings()
config := Config{BufferSize: 2, BatchSize: 100, FlushInterval: time.Hour}
server, err := New(settings, config, func(_ context.Context, _ []audittypes.AuditEvent) error { return nil })
require.NoError(t, err)
ctx := context.Background()
server.Add(ctx, newTestEvent("dashboard", audittypes.ActionCreate))
server.Add(ctx, newTestEvent("dashboard", audittypes.ActionUpdate))
server.Add(ctx, newTestEvent("dashboard", audittypes.ActionDelete))
assert.Equal(t, 2, server.queueLen())
}
func TestStop_DrainsRemainingEvents(t *testing.T) {
var exported atomic.Int64
settings := newTestSettings()
config := Config{BufferSize: 100, BatchSize: 100, FlushInterval: time.Hour}
server, err := New(settings, config, func(_ context.Context, events []audittypes.AuditEvent) error {
exported.Add(int64(len(events)))
return nil
})
require.NoError(t, err)
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
go func() { _ = server.Start(ctx) }()
for i := 0; i < 5; i++ {
server.Add(ctx, newTestEvent("alert-rule", audittypes.ActionCreate))
}
require.NoError(t, server.Stop(ctx))
assert.Equal(t, int64(5), exported.Load())
}
func TestAdd_ContinuesAfterExportFailure(t *testing.T) {
var calls atomic.Int64
settings := newTestSettings()
config := Config{BufferSize: 100, BatchSize: 2, FlushInterval: time.Hour}
server, err := New(settings, config, func(_ context.Context, _ []audittypes.AuditEvent) error {
calls.Add(1)
return errors.New(errors.TypeInternal, errors.CodeInternal, "connection refused")
})
require.NoError(t, err)
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
go func() { _ = server.Start(ctx) }()
server.Add(ctx, newTestEvent("user", audittypes.ActionDelete))
server.Add(ctx, newTestEvent("user", audittypes.ActionDelete))
assert.Eventually(t, func() bool {
return calls.Load() >= 1
}, 2*time.Second, 10*time.Millisecond)
require.NoError(t, server.Stop(ctx))
}
func TestAdd_ConcurrentSafety(t *testing.T) {
var exported atomic.Int64
settings := newTestSettings()
config := Config{BufferSize: 1000, BatchSize: 10, FlushInterval: 50 * time.Millisecond}
server, err := New(settings, config, func(_ context.Context, events []audittypes.AuditEvent) error {
exported.Add(int64(len(events)))
return nil
})
require.NoError(t, err)
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
go func() { _ = server.Start(ctx) }()
var wg sync.WaitGroup
for i := 0; i < 100; i++ {
wg.Add(1)
go func() {
defer wg.Done()
server.Add(ctx, newTestEvent("dashboard", audittypes.ActionCreate))
}()
}
wg.Wait()
require.NoError(t, server.Stop(ctx))
assert.Equal(t, int64(100), exported.Load())
}


@@ -0,0 +1,48 @@
package auditorserver
import (
"github.com/SigNoz/signoz/pkg/errors"
"go.opentelemetry.io/otel/metric"
)
type serverMetrics struct {
eventsEmitted metric.Int64Counter
writeErrors metric.Int64Counter
eventsDropped metric.Int64Counter
bufferSize metric.Int64ObservableGauge
}
func newServerMetrics(meter metric.Meter) (*serverMetrics, error) {
var errs error
eventsEmitted, err := meter.Int64Counter("signoz.audit.events.emitted", metric.WithDescription("Total number of audit events emitted for export."))
if err != nil {
errs = errors.Join(errs, err)
}
writeErrors, err := meter.Int64Counter("signoz.audit.store.write_errors", metric.WithDescription("Total number of audit store write errors during export."))
if err != nil {
errs = errors.Join(errs, err)
}
eventsDropped, err := meter.Int64Counter("signoz.audit.events.dropped", metric.WithDescription("Total number of audit events dropped due to a full buffer."))
if err != nil {
errs = errors.Join(errs, err)
}
bufferSize, err := meter.Int64ObservableGauge("signoz.audit.events.buffer_size", metric.WithDescription("Current number of audit events buffered for export."))
if err != nil {
errs = errors.Join(errs, err)
}
if errs != nil {
return nil, errs
}
return &serverMetrics{
eventsEmitted: eventsEmitted,
writeErrors: writeErrors,
eventsDropped: eventsDropped,
bufferSize: bufferSize,
}, nil
}

pkg/auditor/config.go Normal file

@@ -0,0 +1,106 @@
package auditor
import (
"time"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
)
var _ factory.Config = (*Config)(nil)
type Config struct {
Provider string `mapstructure:"provider"`
// BufferSize is the async channel capacity for audit events.
// Events are dropped when the buffer is full (fail-open).
BufferSize int `mapstructure:"buffer_size"`
// BatchSize is the maximum number of events per export batch.
BatchSize int `mapstructure:"batch_size"`
// FlushInterval is the maximum time between export flushes.
FlushInterval time.Duration `mapstructure:"flush_interval"`
OTLPHTTP OTLPHTTPConfig `mapstructure:"otlphttp"`
}
// OTLPHTTPConfig holds configuration for the OTLP HTTP exporter provider.
// Fields map to go.opentelemetry.io/otel/exporters/otlp/otlplog/otlploghttp options.
type OTLPHTTPConfig struct {
// Endpoint is the target host:port (without scheme or path).
Endpoint string `mapstructure:"endpoint"`
// URLPath overrides the default URL path (/v1/logs).
URLPath string `mapstructure:"url_path"`
// Insecure disables TLS, using HTTP instead of HTTPS.
Insecure bool `mapstructure:"insecure"`
// Compression sets the compression strategy. Supported: "none", "gzip".
Compression string `mapstructure:"compression"`
// Timeout is the maximum duration for an export attempt.
Timeout time.Duration `mapstructure:"timeout"`
// Headers are additional HTTP headers sent with every export request.
Headers map[string]string `mapstructure:"headers"`
// Retry configures exponential backoff retry policy for failed exports.
Retry RetryConfig `mapstructure:"retry"`
}
// RetryConfig configures exponential backoff for the OTLP HTTP exporter.
type RetryConfig struct {
// Enabled controls whether retries are attempted on transient failures.
Enabled bool `mapstructure:"enabled"`
// InitialInterval is the initial wait time before the first retry.
InitialInterval time.Duration `mapstructure:"initial_interval"`
// MaxInterval is the upper bound on backoff interval.
MaxInterval time.Duration `mapstructure:"max_interval"`
// MaxElapsedTime is the total maximum time spent retrying.
MaxElapsedTime time.Duration `mapstructure:"max_elapsed_time"`
}
func newConfig() factory.Config {
return Config{
BufferSize: 1000,
BatchSize: 100,
FlushInterval: time.Second,
OTLPHTTP: OTLPHTTPConfig{
Endpoint: "localhost:4318",
URLPath: "/v1/logs",
Compression: "none",
Timeout: 10 * time.Second,
Retry: RetryConfig{
Enabled: true,
InitialInterval: 5 * time.Second,
MaxInterval: 30 * time.Second,
MaxElapsedTime: time.Minute,
},
},
}
}
func NewConfigFactory() factory.ConfigFactory {
return factory.NewConfigFactory(factory.MustNewName("auditor"), newConfig)
}
func (c Config) Validate() error {
if c.BufferSize <= 0 {
return errors.New(errors.TypeInvalidInput, errors.CodeInvalidInput, "auditor::buffer_size must be greater than 0")
}
if c.BatchSize <= 0 {
return errors.New(errors.TypeInvalidInput, errors.CodeInvalidInput, "auditor::batch_size must be greater than 0")
}
if c.FlushInterval <= 0 {
return errors.New(errors.TypeInvalidInput, errors.CodeInvalidInput, "auditor::flush_interval must be greater than 0")
}
if c.BatchSize > c.BufferSize {
return errors.New(errors.TypeInvalidInput, errors.CodeInvalidInput, "auditor::batch_size must not exceed auditor::buffer_size")
}
return nil
}


@@ -0,0 +1,42 @@
package noopauditor
import (
"context"
"github.com/SigNoz/signoz/pkg/auditor"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/types/audittypes"
)
type provider struct {
healthyC chan struct{}
stopC chan struct{}
}
func NewFactory() factory.ProviderFactory[auditor.Auditor, auditor.Config] {
return factory.NewProviderFactory(factory.MustNewName("noop"), New)
}
func New(ctx context.Context, providerSettings factory.ProviderSettings, config auditor.Config) (auditor.Auditor, error) {
return &provider{
healthyC: make(chan struct{}),
stopC: make(chan struct{}),
}, nil
}
func (p *provider) Start(_ context.Context) error {
close(p.healthyC)
<-p.stopC
return nil
}
func (p *provider) Stop(_ context.Context) error {
close(p.stopC)
return nil
}
func (p *provider) Healthy() <-chan struct{} {
return p.healthyC
}
func (p *provider) Audit(_ context.Context, _ audittypes.AuditEvent) {}
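Even as a no-op, the provider still implements the full lifecycle contract: `Start` blocks until `Stop`, and `Healthy` returns a channel that is closed once the service is ready (a closed channel unblocks every receiver, forever). A stdlib-only sketch of that contract (`lifecycle` is an illustrative name):

```go
package main

import "fmt"

// lifecycle mirrors the noop provider above: Start parks on stopC,
// and readiness is signalled by closing healthyC.
type lifecycle struct {
	healthyC chan struct{}
	stopC    chan struct{}
}

func newLifecycle() *lifecycle {
	return &lifecycle{healthyC: make(chan struct{}), stopC: make(chan struct{})}
}

func (l *lifecycle) Start() error {
	close(l.healthyC) // ready: every receiver unblocks from now on
	<-l.stopC         // park until Stop
	return nil
}

func (l *lifecycle) Stop() error {
	close(l.stopC)
	return nil
}

func (l *lifecycle) Healthy() <-chan struct{} { return l.healthyC }

func main() {
	l := newLifecycle()
	done := make(chan error, 1)
	go func() { done <- l.Start() }()
	<-l.Healthy() // blocks until Start has run
	_ = l.Stop()
	fmt.Println(<-done == nil) // true
}
```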


@@ -210,7 +210,7 @@ func (a *AuthN) fetchGoogleWorkspaceGroups(ctx context.Context, userEmail string
return a.getGroups(ctx, adminService, userEmail, config.FetchTransitiveGroupMembership, checkedGroups)
}
// Recursive method
// Recursive method.
func (a *AuthN) getGroups(ctx context.Context, adminService *admin.Service, userEmail string, fetchTransitive bool, checkedGroups map[string]struct{}) ([]string, error) {
var userGroups []string
var pageToken string


@@ -14,6 +14,7 @@ import (
"github.com/SigNoz/signoz/pkg/sqlstore"
openfgav1 "github.com/openfga/api/proto/openfga/v1"
openfgapkgtransformer "github.com/openfga/language/pkg/go/transformer"
"github.com/openfga/openfga/pkg/storage"
)
type provider struct {
@@ -21,14 +22,14 @@ type provider struct {
store authtypes.RoleStore
}
func NewProviderFactory(sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile) factory.ProviderFactory[authz.AuthZ, authz.Config] {
func NewProviderFactory(sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile, openfgaDataStore storage.OpenFGADatastore) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return factory.NewProviderFactory(factory.MustNewName("openfga"), func(ctx context.Context, ps factory.ProviderSettings, config authz.Config) (authz.AuthZ, error) {
return newOpenfgaProvider(ctx, ps, config, sqlstore, openfgaSchema)
return newOpenfgaProvider(ctx, ps, config, sqlstore, openfgaSchema, openfgaDataStore)
})
}
func newOpenfgaProvider(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile) (authz.AuthZ, error) {
server, err := openfgaserver.NewOpenfgaServer(ctx, settings, config, sqlstore, openfgaSchema)
func newOpenfgaProvider(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile, openfgaDataStore storage.OpenFGADatastore) (authz.AuthZ, error) {
server, err := openfgaserver.NewOpenfgaServer(ctx, settings, config, sqlstore, openfgaSchema, openfgaDataStore)
if err != nil {
return nil, err
}


@@ -15,6 +15,7 @@ import (
openfgav1 "github.com/openfga/api/proto/openfga/v1"
openfgapkgtransformer "github.com/openfga/language/pkg/go/transformer"
openfgapkgserver "github.com/openfga/openfga/pkg/server"
"github.com/openfga/openfga/pkg/storage"
"google.golang.org/protobuf/encoding/protojson"
)
@@ -38,18 +39,12 @@ type Server struct {
healthyC chan struct{}
}
func NewOpenfgaServer(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile) (*Server, error) {
func NewOpenfgaServer(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile, openfgaDataStore storage.OpenFGADatastore) (*Server, error) {
scopedProviderSettings := factory.NewScopedProviderSettings(settings, "github.com/SigNoz/signoz/pkg/authz/openfgaauthz")
store, err := NewSQLStore(sqlstore)
if err != nil {
scopedProviderSettings.Logger().DebugContext(ctx, "failed to initialize sqlstore for authz")
return nil, err
}
// setup the openfga server
opts := []openfgapkgserver.OpenFGAServiceV1Option{
openfgapkgserver.WithDatastore(store),
openfgapkgserver.WithDatastore(openfgaDataStore),
openfgapkgserver.WithLogger(NewLogger(scopedProviderSettings.Logger())),
openfgapkgserver.WithContextPropagationToDatastore(true),
}


@@ -16,22 +16,22 @@ import (
func TestProviderStartStop(t *testing.T) {
providerSettings := instrumentationtest.New().ToProviderSettings()
sqlstore := sqlstoretest.New(sqlstore.Config{Provider: "postgres"}, sqlmock.QueryMatcherRegexp)
sqlstore := sqlstoretest.New(sqlstore.Config{Provider: "sqlite"}, sqlmock.QueryMatcherRegexp)
openfgaDataStore, err := NewSQLStore(sqlstore)
require.NoError(t, err)
expectedModel := `module base
type user`
provider, err := NewOpenfgaServer(context.Background(), providerSettings, authz.Config{}, sqlstore, []transformer.ModuleFile{{Name: "test.fga", Contents: expectedModel}})
provider, err := NewOpenfgaServer(context.Background(), providerSettings, authz.Config{}, sqlstore, []transformer.ModuleFile{{Name: "test.fga", Contents: expectedModel}}, openfgaDataStore)
require.NoError(t, err)
storeRows := sqlstore.Mock().NewRows([]string{"id", "name", "created_at", "updated_at"}).AddRow("01K3V0NTN47MPTMEV1PD5ST6ZC", "signoz", time.Now(), time.Now())
sqlstore.Mock().ExpectQuery("SELECT (.+) FROM store WHERE (.+)").WillReturnRows(storeRows)
authModelCollectionRows := sqlstore.Mock().NewRows([]string{"authorization_model_id"}).AddRow("01K44QQKXR6F729W160NFCJT58")
sqlstore.Mock().ExpectQuery("SELECT DISTINCT (.+) FROM authorization_model WHERE store (.+) ORDER BY (.+)").WillReturnRows(authModelCollectionRows)
modelRows := sqlstore.Mock().NewRows([]string{"authorization_model_id", "schema_version", "type", "type_definition", "serialized_protobuf"}).
AddRow("01K44QQKXR6F729W160NFCJT58", "1.1", "", "", "")
sqlstore.Mock().ExpectQuery("SELECT authorization_model_id, schema_version, type, type_definition, serialized_protobuf FROM authorization_model WHERE authorization_model_id = (.+) AND store = (.+)").WithArgs("01K44QQKXR6F729W160NFCJT58", "01K3V0NTN47MPTMEV1PD5ST6ZC").WillReturnRows(modelRows)
authModelRows := sqlstore.Mock().NewRows([]string{"authorization_model_id", "schema_version", "serialized_protobuf"}).
AddRow("01K44QQKXR6F729W160NFCJT58", "1.1", []byte(""))
sqlstore.Mock().ExpectQuery("SELECT (.+) FROM authorization_model WHERE (.+)").WillReturnRows(authModelRows)
sqlstore.Mock().ExpectExec("INSERT INTO authorization_model (.+) VALUES (.+)").WillReturnResult(sqlmock.NewResult(1, 1))


@@ -4,23 +4,18 @@ import (
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/openfga/openfga/pkg/storage"
"github.com/openfga/openfga/pkg/storage/postgres"
"github.com/openfga/openfga/pkg/storage/sqlcommon"
"github.com/openfga/openfga/pkg/storage/sqlite"
)
func NewSQLStore(sqlstore sqlstore.SQLStore) (storage.OpenFGADatastore, error) {
switch sqlstore.BunDB().Dialect().Name().String() {
func NewSQLStore(store sqlstore.SQLStore) (storage.OpenFGADatastore, error) {
switch store.BunDB().Dialect().Name().String() {
case "sqlite":
return sqlite.NewWithDB(sqlstore.SQLDB(), &sqlcommon.Config{
MaxTuplesPerWriteField: 100,
MaxTypesPerModelField: 100,
})
case "pg":
return postgres.NewWithDB(sqlstore.SQLDB(), nil, &sqlcommon.Config{
return sqlite.NewWithDB(store.SQLDB(), &sqlcommon.Config{
MaxTuplesPerWriteField: 100,
MaxTypesPerModelField: 100,
})
}
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "invalid store type: %s", sqlstore.BunDB().Dialect().Name().String())
return nil, errors.Newf(errors.TypeInvalidInput, errors.CodeInvalidInput, "invalid store type: %s", store.BunDB().Dialect().Name().String())
}
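`NewSQLStore` dispatches on the bun dialect name and rejects anything unsupported; with this change the `"pg"` branch and the postgres datastore are gone, leaving `"sqlite"` as the only accepted dialect. A trivial standalone sketch of the dispatch shape (`storeFor` and its return values are illustrative):

```go
package main

import (
	"errors"
	"fmt"
)

// storeFor mirrors NewSQLStore above: switch on the dialect name and
// return an error for any unsupported store type.
func storeFor(dialect string) (string, error) {
	switch dialect {
	case "sqlite":
		return "sqlite-datastore", nil
	}
	return "", errors.New("invalid store type: " + dialect)
}

func main() {
	s, err := storeFor("sqlite")
	fmt.Println(s, err == nil) // sqlite-datastore true
	_, err = storeFor("pg")
	fmt.Println(err != nil) // true: postgres support was removed
}
```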


@@ -27,7 +27,7 @@ func NewConf() *Conf {
// NewConfFromMap creates a new Conf instance from a map.
func NewConfFromMap(m map[string]any) (*Conf, error) {
conf := NewConf()
if err := conf.Koanf.Load(confmap.Provider(m, KoanfDelimiter), nil); err != nil {
if err := conf.Load(confmap.Provider(m, KoanfDelimiter), nil); err != nil {
return nil, err
}
@@ -74,7 +74,7 @@ func (conf *Conf) Unmarshal(path string, input any, tags ...string) error {
Result: input,
}
err := conf.Koanf.UnmarshalWithConf(path, input, koanf.UnmarshalConf{Tag: tag, DecoderConfig: dc})
err := conf.UnmarshalWithConf(path, input, koanf.UnmarshalConf{Tag: tag, DecoderConfig: dc})
if err != nil {
return err
}
@@ -93,7 +93,7 @@ func (conf *Conf) Set(key string, input any) error {
}
newConf := NewConf()
if err := newConf.Koanf.Load(confmap.Provider(m, KoanfDelimiter), nil); err != nil {
if err := newConf.Load(confmap.Provider(m, KoanfDelimiter), nil); err != nil {
return err
}


@@ -40,7 +40,7 @@ type WhereClauseRewriter struct {
// In this case, the Severity text will appear in the `labels` if it were part of the group
// by clause, in which case we replace it with the actual value for the notification
// i.e Severity text = WARN
// If the Severity text is not part of the group by clause, then we add it as it is
// If the Severity text is not part of the group by clause, then we add it as it is.
func PrepareFilterExpression(labels map[string]string, whereClause string, groupByItems []qbtypes.GroupByKey) string {
if whereClause == "" && len(labels) == 0 {
return ""
@@ -100,12 +100,12 @@ func PrepareFilterExpression(labels map[string]string, whereClause string, group
return rewrittenClause
}
// Visit implements the visitor for the query rule
// Visit implements the visitor for the query rule.
func (r *WhereClauseRewriter) Visit(tree antlr.ParseTree) interface{} {
return tree.Accept(r)
}
// VisitQuery visits the query node
// VisitQuery visits the query node.
func (r *WhereClauseRewriter) VisitQuery(ctx *parser.QueryContext) interface{} {
if ctx.Expression() != nil {
ctx.Expression().Accept(r)
@@ -113,7 +113,7 @@ func (r *WhereClauseRewriter) VisitQuery(ctx *parser.QueryContext) interface{} {
return nil
}
// VisitExpression visits the expression node
// VisitExpression visits the expression node.
func (r *WhereClauseRewriter) VisitExpression(ctx *parser.ExpressionContext) interface{} {
if ctx.OrExpression() != nil {
ctx.OrExpression().Accept(r)
@@ -121,7 +121,7 @@ func (r *WhereClauseRewriter) VisitExpression(ctx *parser.ExpressionContext) int
return nil
}
// VisitOrExpression visits OR expressions
// VisitOrExpression visits OR expressions.
func (r *WhereClauseRewriter) VisitOrExpression(ctx *parser.OrExpressionContext) interface{} {
for i, andExpr := range ctx.AllAndExpression() {
if i > 0 {
@@ -132,7 +132,7 @@ func (r *WhereClauseRewriter) VisitOrExpression(ctx *parser.OrExpressionContext)
return nil
}
// VisitAndExpression visits AND expressions
// VisitAndExpression visits AND expressions.
func (r *WhereClauseRewriter) VisitAndExpression(ctx *parser.AndExpressionContext) interface{} {
unaryExprs := ctx.AllUnaryExpression()
for i, unaryExpr := range unaryExprs {
@@ -150,7 +150,7 @@ func (r *WhereClauseRewriter) VisitAndExpression(ctx *parser.AndExpressionContex
return nil
}
// VisitUnaryExpression visits unary expressions (with optional NOT)
// VisitUnaryExpression visits unary expressions (with optional NOT).
func (r *WhereClauseRewriter) VisitUnaryExpression(ctx *parser.UnaryExpressionContext) interface{} {
if ctx.NOT() != nil {
r.rewritten.WriteString("NOT ")
@@ -161,7 +161,7 @@ func (r *WhereClauseRewriter) VisitUnaryExpression(ctx *parser.UnaryExpressionCo
return nil
}
// VisitPrimary visits primary expressions
// VisitPrimary visits primary expressions.
func (r *WhereClauseRewriter) VisitPrimary(ctx *parser.PrimaryContext) interface{} {
if ctx.LPAREN() != nil && ctx.RPAREN() != nil {
r.rewritten.WriteString("(")
@@ -183,7 +183,7 @@ func (r *WhereClauseRewriter) VisitPrimary(ctx *parser.PrimaryContext) interface
return nil
}
// VisitComparison visits comparison expressions
// VisitComparison visits comparison expressions.
func (r *WhereClauseRewriter) VisitComparison(ctx *parser.ComparisonContext) interface{} {
if ctx.Key() == nil {
return nil
@@ -304,7 +304,7 @@ func (r *WhereClauseRewriter) VisitComparison(ctx *parser.ComparisonContext) int
return nil
}
// VisitInClause visits IN clauses
// VisitInClause visits IN clauses.
func (r *WhereClauseRewriter) VisitInClause(ctx *parser.InClauseContext) interface{} {
r.rewritten.WriteString("IN ")
if ctx.LPAREN() != nil {
@@ -325,7 +325,7 @@ func (r *WhereClauseRewriter) VisitInClause(ctx *parser.InClauseContext) interfa
return nil
}
// VisitNotInClause visits NOT IN clauses
// VisitNotInClause visits NOT IN clauses.
func (r *WhereClauseRewriter) VisitNotInClause(ctx *parser.NotInClauseContext) interface{} {
r.rewritten.WriteString("NOT IN ")
if ctx.LPAREN() != nil {
@@ -346,7 +346,7 @@ func (r *WhereClauseRewriter) VisitNotInClause(ctx *parser.NotInClauseContext) i
return nil
}
// VisitValueList visits value lists
// VisitValueList visits value lists.
func (r *WhereClauseRewriter) VisitValueList(ctx *parser.ValueListContext) interface{} {
values := ctx.AllValue()
for i, val := range values {
@@ -358,13 +358,13 @@ func (r *WhereClauseRewriter) VisitValueList(ctx *parser.ValueListContext) inter
return nil
}
// VisitFullText visits full text expressions
// VisitFullText visits full text expressions.
func (r *WhereClauseRewriter) VisitFullText(ctx *parser.FullTextContext) interface{} {
r.rewritten.WriteString(ctx.GetText())
return nil
}
// VisitFunctionCall visits function calls
// VisitFunctionCall visits function calls.
func (r *WhereClauseRewriter) VisitFunctionCall(ctx *parser.FunctionCallContext) interface{} {
// Write function name
if ctx.HAS() != nil {
@@ -385,7 +385,7 @@ func (r *WhereClauseRewriter) VisitFunctionCall(ctx *parser.FunctionCallContext)
return nil
}
// VisitFunctionParamList visits function parameter lists
// VisitFunctionParamList visits function parameter lists.
func (r *WhereClauseRewriter) VisitFunctionParamList(ctx *parser.FunctionParamListContext) interface{} {
params := ctx.AllFunctionParam()
for i, param := range params {
@@ -397,7 +397,7 @@ func (r *WhereClauseRewriter) VisitFunctionParamList(ctx *parser.FunctionParamLi
return nil
}
// VisitFunctionParam visits function parameters
// VisitFunctionParam visits function parameters.
func (r *WhereClauseRewriter) VisitFunctionParam(ctx *parser.FunctionParamContext) interface{} {
if ctx.Key() != nil {
ctx.Key().Accept(r)
@@ -409,7 +409,7 @@ func (r *WhereClauseRewriter) VisitFunctionParam(ctx *parser.FunctionParamContex
return nil
}
// VisitArray visits array expressions
// VisitArray visits array expressions.
func (r *WhereClauseRewriter) VisitArray(ctx *parser.ArrayContext) interface{} {
r.rewritten.WriteString("[")
if ctx.ValueList() != nil {
@@ -419,13 +419,13 @@ func (r *WhereClauseRewriter) VisitArray(ctx *parser.ArrayContext) interface{} {
return nil
}
// VisitValue visits value expressions
// VisitValue visits value expressions.
func (r *WhereClauseRewriter) VisitValue(ctx *parser.ValueContext) interface{} {
r.rewritten.WriteString(ctx.GetText())
return nil
}
// VisitKey visits key expressions
// VisitKey visits key expressions.
func (r *WhereClauseRewriter) VisitKey(ctx *parser.KeyContext) interface{} {
r.keysSeen[ctx.GetText()] = struct{}{}
r.rewritten.WriteString(ctx.GetText())
@@ -438,7 +438,7 @@ func (r *WhereClauseRewriter) isKeyInWhereClause(key string) bool {
}
// escapeValueIfNeeded adds single quotes to string values and escapes single quotes within them
// Numeric and boolean values are returned as-is
// Numeric and boolean values are returned as-is.
func escapeValueIfNeeded(value string) string {
// Check if it's a number
if _, err := fmt.Sscanf(value, "%f", new(float64)); err == nil {

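The hunk above is truncated mid-function, but the doc comment fully describes the contract: numbers and booleans pass through unquoted, everything else is single-quoted with embedded single quotes escaped. A self-contained sketch of that behavior, assuming doubled single quotes as the escape (the actual implementation may escape differently):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// escapeValue sketches escapeValueIfNeeded's documented behavior:
// numeric and boolean values are returned as-is; other values are
// single-quoted, with inner quotes doubled (an assumption here).
func escapeValue(value string) string {
	if _, err := strconv.ParseFloat(value, 64); err == nil {
		return value // numeric, no quoting
	}
	if value == "true" || value == "false" {
		return value // boolean, no quoting
	}
	return "'" + strings.ReplaceAll(value, "'", "''") + "'"
}

func main() {
	fmt.Println(escapeValue("42"))        // 42
	fmt.Println(escapeValue("true"))      // true
	fmt.Println(escapeValue("it's WARN")) // 'it''s WARN'
}
```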

@@ -146,7 +146,7 @@ func PrepareLinksToLogs(start, end time.Time, filterItems []v3.FilterItem) strin
// In this case, the Severity text will appear in the `lbls` if it were part of the group
// by clause, in which case we replace it with the actual value for the notification
// i.e Severity text = WARN
// If the Severity text is not part of the group by clause, then we add it as it is
// If the Severity text is not part of the group by clause, then we add it as it is.
func PrepareFilters(labels map[string]string, whereClauseItems []v3.FilterItem, groupByItems []v3.AttributeKey, keys map[string]v3.AttributeKey) []v3.FilterItem {
filterItems := make([]v3.FilterItem, 0)


@@ -5,7 +5,7 @@ import (
"github.com/SigNoz/signoz/pkg/types/ruletypes"
)
// TODO(srikanthccv): Fix the URL management
// TODO(srikanthccv): Fix the URL management.
type URLShareableTimeRange struct {
Start int64 `json:"start"`
End int64 `json:"end"`


@@ -155,7 +155,7 @@ func (b *base) WithAdditional(a ...string) *base {
// Otherwise, it returns TypeInternal, the original error string
// and the error itself.
//
//lint:ignore ST1008 we want to return arguments in the 'TCMEUA' order of the struct
//nolint:staticcheck // ST1008: intentional return order matching struct field order (TCMEUA)
func Unwrapb(cause error) (typ, Code, string, error, string, []string) {
base, ok := cause.(*base)
if ok {


@@ -16,7 +16,7 @@ var (
TypeLicenseUnavailable = typ{"license-unavailable"}
)
// Defines custom error types
// Defines custom error types.
type typ struct{ s string }
func (t typ) String() string {


@@ -115,7 +115,7 @@ func (f *flagger) Boolean(ctx context.Context, flag featuretypes.Name, evalCtx f
for _, client := range f.clients {
value, err := client.BooleanValue(ctx, flag.String(), defaultValue, evalCtx.Ctx())
if err != nil {
-f.settings.Logger().ErrorContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Name()))
+f.settings.Logger().ErrorContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Domain()))
continue
}
@@ -148,7 +148,7 @@ func (f *flagger) String(ctx context.Context, flag featuretypes.Name, evalCtx fe
for _, client := range f.clients {
value, err := client.StringValue(ctx, flag.String(), defaultValue, evalCtx.Ctx())
if err != nil {
-f.settings.Logger().WarnContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Name()))
+f.settings.Logger().WarnContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Domain()))
continue
}
@@ -181,7 +181,7 @@ func (f *flagger) Float(ctx context.Context, flag featuretypes.Name, evalCtx fea
for _, client := range f.clients {
value, err := client.FloatValue(ctx, flag.String(), defaultValue, evalCtx.Ctx())
if err != nil {
-f.settings.Logger().WarnContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Name()))
+f.settings.Logger().WarnContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Domain()))
continue
}
@@ -214,7 +214,7 @@ func (f *flagger) Int(ctx context.Context, flag featuretypes.Name, evalCtx featu
for _, client := range f.clients {
value, err := client.IntValue(ctx, flag.String(), defaultValue, evalCtx.Ctx())
if err != nil {
-f.settings.Logger().WarnContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Name()))
+f.settings.Logger().WarnContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Domain()))
continue
}
@@ -247,7 +247,7 @@ func (f *flagger) Object(ctx context.Context, flag featuretypes.Name, evalCtx fe
for _, client := range f.clients {
value, err := client.ObjectValue(ctx, flag.String(), defaultValue, evalCtx.Ctx())
if err != nil {
-f.settings.Logger().WarnContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Name()))
+f.settings.Logger().WarnContext(ctx, "failed to get value from client", errors.Attr(err), slog.Any("flag", flag), slog.String("client", client.Metadata().Domain()))
continue
}

Some files were not shown because too many files have changed in this diff.