Compare commits

..

24 Commits

Author SHA1 Message Date
Abhi kumar
eba011c2bb Merge branch 'chore/base-config-builder' into feat/bar-panel 2026-02-10 23:04:16 +05:30
Abhi Kumar
6c0843595a feat: added new barpanel component 2026-02-10 23:01:40 +05:30
Abhi kumar
703b221dfe Merge branch 'main' into chore/base-config-builder 2026-02-10 22:28:27 +05:30
Abhi Kumar
9f13086214 chore: removed dayjs extension 2026-02-10 22:23:33 +05:30
Vishal Sharma
a3bd72ad86 feat(onboarding): add Docker support for apm data sources, Convex and simplify PHP flow (#10261)
* feat(onboarding): add Docker support for apm data sources, Convex and simplify PHP flow

- Add Convex logo asset for integration branding
- Add Docker deployment options for Java frameworks (Spring Boot, Tomcat, JBoss, Quarkus)
- Update JBoss configuration to include WildFly support and associated keywords
- Restructure PHP onboarding flow to prioritize environment selection over framework
- Update Java keywords to include JDBC and remove deprecated metrics tag

* chore: remove 'angular' from related keywords in the onboarding configuration
2026-02-10 18:23:24 +05:30
Nikhil Soni
786070d90b refactor: switch to group by instead of triggering aggregation (#10263)
Using FINAL in a ClickHouse query will trigger the aggregation
merge of data at read time, while using GROUP BY will be more efficient.
It's recommended in the docs as well: https://clickhouse.com/docs/engines/table-engines/mergetree-family/aggregatingmergetree#select-and-insert
2026-02-10 12:35:31 +00:00
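Not part of the PR — a minimal sketch of the two query shapes the message contrasts, assuming a hypothetical AggregatingMergeTree table `spans_agg` with an `AggregateFunction(count)` column; the SQL is embedded as Go string constants.

```go
package main

import "fmt"

// Hypothetical table for illustration:
//   CREATE TABLE spans_agg (
//       service LowCardinality(String),
//       cnt     AggregateFunction(count)
//   ) ENGINE = AggregatingMergeTree() ORDER BY service;
const (
	// Reading with FINAL forces ClickHouse to merge data parts at query time.
	queryWithFinal = `SELECT service, finalizeAggregation(cnt) AS span_count
FROM spans_agg FINAL`

	// Form recommended by the linked docs: GROUP BY plus the -Merge combinator,
	// so the aggregate states are merged as part of normal query execution.
	queryWithGroupBy = `SELECT service, countMerge(cnt) AS span_count
FROM spans_agg
GROUP BY service`
)

func main() {
	fmt.Println("with FINAL:\n" + queryWithFinal)
	fmt.Println("with GROUP BY:\n" + queryWithGroupBy)
}
```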
Abhi kumar
20e58db10d Merge branch 'main' into chore/base-config-builder 2026-02-10 16:54:27 +05:30
Vikrant Gupta
e699ad8122 fix(meter): custom step intervals for meter aggregations (#10255)
* fix(meter): custom step interval support

* fix(meter): custom step interval support

* fix(meter): custom step interval support

* fix(meter): custom step interval support

* fix(meter): remove frontend hardcoding for step interval

* fix(meter): remove frontend hardcoding for step interval
2026-02-10 16:54:14 +05:30
Abhi Kumar
974bfcd732 chore: added different tooltips 2026-02-10 16:53:49 +05:30
Abhi Kumar
e6d89465da fix: pr review changes 2026-02-10 16:05:04 +05:30
Abhi Kumar
a634ae9b66 fix: pr review changes 2026-02-10 15:48:29 +05:30
Abhi Kumar
bb1f5ba29f chore: tsc fix 2026-02-10 13:31:47 +05:30
Abhi Kumar
30b3a68154 Merge branch 'main' of https://github.com/SigNoz/signoz into chore/base-config-builder 2026-02-10 12:57:31 +05:30
Abhi Kumar
08aa8759ba chore: added a common chart wrapper 2026-02-10 12:57:02 +05:30
Ishan
a1cc05848c feat: added open filter (#10244) 2026-02-10 11:33:17 +05:30
Ederson G. Elias
6d78df2275 fix: Service Map environment filter not working with DOT_METRICS_ENABLED (#10227)
* fix: Service Map environment filter not working with DOT_METRICS_ENABLED

When DOT_METRICS_ENABLED is active, resource attribute keys use dot notation
(e.g. resource_deployment.environment) instead of underscore notation
(e.g. resource_deployment_environment). The whilelistedKeys array only
contained underscore-notation keys, causing getVisibleQueries and
mappingWithRoutesAndKeys to filter out valid queries on the Service Map.

- Add dot-notation variants to whilelistedKeys for environment, k8s cluster
  name, and k8s namespace
- Remove unnecessary onBlur handler from environment Select component
- Add unit tests for whilelistedKeys and mappingWithRoutesAndKeys

Closes #10226

* chore: run prettify

---------

Co-authored-by: srikanthccv <srikanth.chekuri92@gmail.com>
2026-02-10 04:48:31 +00:00
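For illustration only — the real `whilelistedKeys` array lives in the SigNoz frontend (TypeScript); this Go sketch, with key names partly assumed, just shows what keeping both underscore- and dot-notation variants in the whitelist amounts to.

```go
package main

import "fmt"

// Sketch with assumed key names; only resource_deployment.environment /
// resource_deployment_environment are quoted from the commit message.
var whitelistedKeys = []string{
	// underscore notation (DOT_METRICS_ENABLED off)
	"resource_deployment_environment",
	"k8s_cluster_name",
	"k8s_namespace_name",
	// dot notation (DOT_METRICS_ENABLED on) -- the variants the fix adds
	"resource_deployment.environment",
	"k8s.cluster.name",
	"k8s.namespace.name",
}

func isWhitelisted(key string) bool {
	for _, k := range whitelistedKeys {
		if k == key {
			return true
		}
	}
	return false
}

func main() {
	// Before the fix, dot-notation keys were filtered out and valid Service Map
	// environment queries never ran.
	fmt.Println(isWhitelisted("resource_deployment.environment"))
}
```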
Abhi Kumar
c941233723 chore: refactored the config builder and added base config builder 2026-02-09 21:34:08 +05:30
Ashwin Bhatkal
df49484bea fix: fix flaky dashboard test (#10254)
2026-02-09 14:43:48 +00:00
Ashwin Bhatkal
72b0f27494 chore: use variable select strategy + (#10245)
* chore: use variable select strategy

* chore: revert custom multi select

* chore: fix tests

* chore: fix tests

* chore: fix flaky test
2026-02-09 14:29:28 +00:00
Abhi kumar
e36b647bc7 test: added test suites for uplotchart component (#10247)
* test: added test suites for uplotchart component

* chore: resolved pr review comments

* chore: resolved pr review comments
2026-02-09 14:15:21 +00:00
Nikhil Soni
b491772eaa fix: ensure trace time range is fetched correctly (#10252)
If multiple batches are inserted with the same trace_id, the
trace_summary table can have multiple rows before they are
aggregated by ClickHouse. The query to get the time range from
trace_summary was assuming a single row, which created
unpredictable behaviour, as any random row could be returned.
2026-02-09 19:31:08 +05:30
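Not from the PR — an illustrative sketch (hypothetical column names) of the difference the fix implies: aggregate the time range across all trace_summary rows for a trace_id instead of assuming ClickHouse has already merged them into one row.

```go
package main

import "fmt"

const (
	// Fragile shape: assumes exactly one row per trace_id, so whichever
	// unmerged row happens to be read determines the result.
	timeRangeSingleRow = `SELECT start_time, end_time
FROM trace_summary
WHERE trace_id = ?
LIMIT 1`

	// Robust shape: aggregate over all rows for the trace_id, giving the same
	// answer whether or not the background merge has happened yet.
	timeRangeAggregated = `SELECT min(start_time) AS start_time, max(end_time) AS end_time
FROM trace_summary
WHERE trace_id = ?`
)

func main() {
	fmt.Println("before (single-row assumption):\n" + timeRangeSingleRow)
	fmt.Println("after (aggregated):\n" + timeRangeAggregated)
}
```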
Ashwin Bhatkal
128497f27a chore: folder name change + CODEOWNER update (#10246)
* chore: folder name change + CODEOWNER update

* chore: revert multi select file change
2026-02-09 09:40:26 +00:00
Jatinderjit Singh
9e466b56b2 chore: preserve the original duration format (#10149) 2026-02-09 09:24:58 +00:00
Vikrant Gupta
4ad0baa2a2 feat(authz): add support for wildcard selector (#10208)
* feat(authz): remove unnecessary dependency injection for role setter

* feat(authz): deprecate role module

* feat(authz): deprecate role module

* feat(authz): split between server and sql actions

* feat(authz): add bootstrap for managed role transactions

* feat(authz): update and add integration tests

* feat(authz): match names for factory and migration

* feat(authz): fix integration tests

* feat(authz): reduce calls on organisation creation
2026-02-09 14:37:44 +05:30
133 changed files with 4462 additions and 4497 deletions

.github/CODEOWNERS vendored
View File

@@ -133,5 +133,8 @@
/frontend/src/pages/PublicDashboard/ @SigNoz/pulse-frontend
/frontend/src/container/PublicDashboardContainer/ @SigNoz/pulse-frontend
## UplotV2
/frontend/src/lib/uPlotV2/ @SigNoz/pulse-frontend
## Dashboard Libs + Components
/frontend/src/lib/uPlotV2/ @SigNoz/pulse-frontend
/frontend/src/lib/dashboard/ @SigNoz/pulse-frontend
/frontend/src/lib/dashboardVariables/ @SigNoz/pulse-frontend
/frontend/src/components/NewSelect/ @SigNoz/pulse-frontend

View File

@@ -18,8 +18,6 @@ import (
"github.com/SigNoz/signoz/pkg/modules/dashboard"
"github.com/SigNoz/signoz/pkg/modules/dashboard/impldashboard"
"github.com/SigNoz/signoz/pkg/modules/organization"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/modules/role/implrole"
"github.com/SigNoz/signoz/pkg/querier"
"github.com/SigNoz/signoz/pkg/query-service/app"
"github.com/SigNoz/signoz/pkg/queryparser"
@@ -78,18 +76,15 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) e
func(ctx context.Context, providerSettings factory.ProviderSettings, store authtypes.AuthNStore, licensing licensing.Licensing) (map[authtypes.AuthNProvider]authn.AuthN, error) {
return signoz.NewAuthNs(ctx, providerSettings, store, licensing)
},
func(ctx context.Context, sqlstore sqlstore.SQLStore) factory.ProviderFactory[authz.AuthZ, authz.Config] {
func(ctx context.Context, sqlstore sqlstore.SQLStore, _ licensing.Licensing, _ dashboard.Module) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return openfgaauthz.NewProviderFactory(sqlstore, openfgaschema.NewSchema().Get(ctx))
},
func(store sqlstore.SQLStore, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, _ role.Setter, _ role.Granter, queryParser queryparser.QueryParser, _ querier.Querier, _ licensing.Licensing) dashboard.Module {
func(store sqlstore.SQLStore, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, queryParser queryparser.QueryParser, _ querier.Querier, _ licensing.Licensing) dashboard.Module {
return impldashboard.NewModule(impldashboard.NewStore(store), settings, analytics, orgGetter, queryParser)
},
func(_ licensing.Licensing) factory.ProviderFactory[gateway.Gateway, gateway.Config] {
return noopgateway.NewProviderFactory()
},
func(store sqlstore.SQLStore, authz authz.AuthZ, licensing licensing.Licensing, _ []role.RegisterTypeable) role.Setter {
return implrole.NewSetter(implrole.NewStore(store), authz)
},
)
if err != nil {
logger.ErrorContext(ctx, "failed to create signoz", "error", err)

View File

@@ -14,7 +14,6 @@ import (
enterpriselicensing "github.com/SigNoz/signoz/ee/licensing"
"github.com/SigNoz/signoz/ee/licensing/httplicensing"
"github.com/SigNoz/signoz/ee/modules/dashboard/impldashboard"
"github.com/SigNoz/signoz/ee/modules/role/implrole"
enterpriseapp "github.com/SigNoz/signoz/ee/query-service/app"
"github.com/SigNoz/signoz/ee/sqlschema/postgressqlschema"
"github.com/SigNoz/signoz/ee/sqlstore/postgressqlstore"
@@ -29,8 +28,6 @@ import (
"github.com/SigNoz/signoz/pkg/modules/dashboard"
pkgimpldashboard "github.com/SigNoz/signoz/pkg/modules/dashboard/impldashboard"
"github.com/SigNoz/signoz/pkg/modules/organization"
"github.com/SigNoz/signoz/pkg/modules/role"
pkgimplrole "github.com/SigNoz/signoz/pkg/modules/role/implrole"
"github.com/SigNoz/signoz/pkg/querier"
"github.com/SigNoz/signoz/pkg/queryparser"
"github.com/SigNoz/signoz/pkg/signoz"
@@ -118,18 +115,15 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) e
return authNs, nil
},
func(ctx context.Context, sqlstore sqlstore.SQLStore) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return openfgaauthz.NewProviderFactory(sqlstore, openfgaschema.NewSchema().Get(ctx))
func(ctx context.Context, sqlstore sqlstore.SQLStore, licensing licensing.Licensing, dashboardModule dashboard.Module) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return openfgaauthz.NewProviderFactory(sqlstore, openfgaschema.NewSchema().Get(ctx), licensing, dashboardModule)
},
func(store sqlstore.SQLStore, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, roleSetter role.Setter, granter role.Granter, queryParser queryparser.QueryParser, querier querier.Querier, licensing licensing.Licensing) dashboard.Module {
return impldashboard.NewModule(pkgimpldashboard.NewStore(store), settings, analytics, orgGetter, roleSetter, granter, queryParser, querier, licensing)
func(store sqlstore.SQLStore, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, queryParser queryparser.QueryParser, querier querier.Querier, licensing licensing.Licensing) dashboard.Module {
return impldashboard.NewModule(pkgimpldashboard.NewStore(store), settings, analytics, orgGetter, queryParser, querier, licensing)
},
func(licensing licensing.Licensing) factory.ProviderFactory[gateway.Gateway, gateway.Config] {
return httpgateway.NewProviderFactory(licensing)
},
func(store sqlstore.SQLStore, authz authz.AuthZ, licensing licensing.Licensing, registry []role.RegisterTypeable) role.Setter {
return implrole.NewSetter(pkgimplrole.NewStore(store), authz, licensing, registry)
},
)
if err != nil {

View File

@@ -616,11 +616,11 @@ paths:
- in: query
name: signal
schema:
$ref: '#/components/schemas/TelemetrytypesSignal'
type: string
- in: query
name: source
schema:
$ref: '#/components/schemas/TelemetrytypesSource'
type: string
- in: query
name: limit
schema:
@@ -638,11 +638,11 @@ paths:
- in: query
name: fieldContext
schema:
$ref: '#/components/schemas/TelemetrytypesFieldContext'
type: string
- in: query
name: fieldDataType
schema:
$ref: '#/components/schemas/TelemetrytypesFieldDataType'
type: string
- in: query
name: metricName
schema:
@@ -698,11 +698,11 @@ paths:
- in: query
name: signal
schema:
$ref: '#/components/schemas/TelemetrytypesSignal'
type: string
- in: query
name: source
schema:
$ref: '#/components/schemas/TelemetrytypesSource'
type: string
- in: query
name: limit
schema:
@@ -720,11 +720,11 @@ paths:
- in: query
name: fieldContext
schema:
$ref: '#/components/schemas/TelemetrytypesFieldContext'
type: string
- in: query
name: fieldDataType
schema:
$ref: '#/components/schemas/TelemetrytypesFieldDataType'
type: string
- in: query
name: metricName
schema:
@@ -3526,62 +3526,6 @@ paths:
summary: Rotate session
tags:
- sessions
/api/v5/query_range:
post:
deprecated: false
description: Execute a composite query over a time range. Supports builder queries
(traces, logs, metrics), formulas, trace operators, PromQL, and ClickHouse
SQL.
operationId: QueryRangeV5
requestBody:
content:
application/json:
schema:
$ref: '#/components/schemas/Querybuildertypesv5QueryRangeRequest'
responses:
"200":
content:
application/json:
schema:
properties:
data:
$ref: '#/components/schemas/Querybuildertypesv5QueryRangeResponse'
status:
type: string
type: object
description: OK
"400":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Bad Request
"401":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Unauthorized
"403":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Forbidden
"500":
content:
application/json:
schema:
$ref: '#/components/schemas/RenderErrorResponse'
description: Internal Server Error
security:
- api_key:
- VIEWER
- tokenizer:
- VIEWER
summary: Query range
tags:
- query
components:
schemas:
AuthtypesAttributeMapping:
@@ -4305,39 +4249,12 @@ components:
- temporality
- isMonotonic
type: object
MetrictypesSpaceAggregation:
enum:
- ""
- sum
- avg
- min
- max
- count
- p50
- p75
- p90
- p95
- p99
type: string
MetrictypesTemporality:
enum:
- delta
- cumulative
- unspecified
type: string
MetrictypesTimeAggregation:
enum:
- ""
- latest
- sum
- avg
- min
- max
- count
- count_distinct
- rate
- increase
type: string
MetrictypesType:
enum:
- gauge
@@ -4395,101 +4312,7 @@ components:
type:
type: string
type: object
Querybuildertypesv5AggregationBucket:
properties:
alias:
type: string
anomalyScores:
items:
$ref: '#/components/schemas/Querybuildertypesv5TimeSeries'
type: array
index:
type: integer
lowerBoundSeries:
items:
$ref: '#/components/schemas/Querybuildertypesv5TimeSeries'
type: array
meta:
properties:
unit:
type: string
type: object
predictedSeries:
items:
$ref: '#/components/schemas/Querybuildertypesv5TimeSeries'
type: array
series:
items:
$ref: '#/components/schemas/Querybuildertypesv5TimeSeries'
nullable: true
type: array
upperBoundSeries:
items:
$ref: '#/components/schemas/Querybuildertypesv5TimeSeries'
type: array
type: object
Querybuildertypesv5Bucket:
properties:
step:
format: double
type: number
type: object
Querybuildertypesv5ClickHouseQuery:
properties:
disabled:
type: boolean
legend:
type: string
name:
type: string
query:
type: string
type: object
Querybuildertypesv5ColumnDescriptor:
properties:
aggregationIndex:
format: int64
type: integer
columnType:
$ref: '#/components/schemas/Querybuildertypesv5ColumnType'
description:
type: string
fieldContext:
$ref: '#/components/schemas/TelemetrytypesFieldContext'
fieldDataType:
$ref: '#/components/schemas/TelemetrytypesFieldDataType'
meta:
properties:
unit:
type: string
type: object
name:
type: string
queryName:
type: string
signal:
$ref: '#/components/schemas/TelemetrytypesSignal'
unit:
type: string
type: object
Querybuildertypesv5ColumnType:
enum:
- group
- aggregation
type: string
Querybuildertypesv5CompositeQuery:
description: Composite query containing one or more query envelopes. Each query
envelope specifies its type and corresponding spec.
properties:
queries:
items:
$ref: '#/components/schemas/Querybuildertypesv5QueryEnvelope'
nullable: true
type: array
type: object
Querybuildertypesv5ExecStats:
description: Execution statistics for the query, including rows scanned, bytes
scanned, and duration.
properties:
bytesScanned:
minimum: 0
@@ -4511,109 +4334,10 @@ components:
expression:
type: string
type: object
Querybuildertypesv5FormatOptions:
properties:
fillGaps:
type: boolean
formatTableResultForUI:
type: boolean
type: object
Querybuildertypesv5Function:
properties:
args:
items:
$ref: '#/components/schemas/Querybuildertypesv5FunctionArg'
type: array
name:
$ref: '#/components/schemas/Querybuildertypesv5FunctionName'
type: object
Querybuildertypesv5FunctionArg:
properties:
name:
type: string
value: {}
type: object
Querybuildertypesv5FunctionName:
enum:
- cutoffmin
- cutoffmax
- clampmin
- clampmax
- absolute
- runningdiff
- log2
- log10
- cumulativesum
- ewma3
- ewma5
- ewma7
- median3
- median5
- median7
- timeshift
- anomaly
- fillzero
type: string
Querybuildertypesv5GroupByKey:
properties:
description:
type: string
fieldContext:
$ref: '#/components/schemas/TelemetrytypesFieldContext'
fieldDataType:
$ref: '#/components/schemas/TelemetrytypesFieldDataType'
name:
type: string
signal:
$ref: '#/components/schemas/TelemetrytypesSignal'
unit:
type: string
type: object
Querybuildertypesv5Having:
properties:
expression:
type: string
type: object
Querybuildertypesv5Label:
properties:
key:
$ref: '#/components/schemas/TelemetrytypesTelemetryFieldKey'
value: {}
type: object
Querybuildertypesv5LimitBy:
properties:
keys:
items:
type: string
nullable: true
type: array
value:
type: string
type: object
Querybuildertypesv5LogAggregation:
properties:
alias:
type: string
expression:
type: string
type: object
Querybuildertypesv5MetricAggregation:
properties:
metricName:
type: string
reduceTo:
$ref: '#/components/schemas/Querybuildertypesv5ReduceTo'
spaceAggregation:
$ref: '#/components/schemas/MetrictypesSpaceAggregation'
temporality:
$ref: '#/components/schemas/MetrictypesTemporality'
timeAggregation:
$ref: '#/components/schemas/MetrictypesTimeAggregation'
type: object
Querybuildertypesv5OrderBy:
properties:
direction:
$ref: '#/components/schemas/Querybuildertypesv5OrderDirection'
type: string
key:
$ref: '#/components/schemas/Querybuildertypesv5OrderByKey'
type: object
@@ -4622,404 +4346,34 @@ components:
description:
type: string
fieldContext:
$ref: '#/components/schemas/TelemetrytypesFieldContext'
type: string
fieldDataType:
$ref: '#/components/schemas/TelemetrytypesFieldDataType'
type: string
name:
type: string
signal:
$ref: '#/components/schemas/TelemetrytypesSignal'
type: string
unit:
type: string
type: object
Querybuildertypesv5OrderDirection:
enum:
- asc
- desc
type: string
Querybuildertypesv5PromQuery:
properties:
disabled:
type: boolean
legend:
type: string
name:
type: string
query:
type: string
stats:
type: boolean
step:
$ref: '#/components/schemas/Querybuildertypesv5Step'
type: object
Querybuildertypesv5QueryBuilderFormula:
properties:
disabled:
type: boolean
expression:
type: string
functions:
items:
$ref: '#/components/schemas/Querybuildertypesv5Function'
type: array
having:
$ref: '#/components/schemas/Querybuildertypesv5Having'
legend:
type: string
limit:
type: integer
name:
type: string
order:
items:
$ref: '#/components/schemas/Querybuildertypesv5OrderBy'
type: array
type: object
Querybuildertypesv5QueryBuilderQueryGithubComSigNozSignozPkgTypesQuerybuildertypesQuerybuildertypesv5LogAggregation:
properties:
aggregations:
items:
$ref: '#/components/schemas/Querybuildertypesv5LogAggregation'
type: array
cursor:
type: string
disabled:
type: boolean
filter:
$ref: '#/components/schemas/Querybuildertypesv5Filter'
functions:
items:
$ref: '#/components/schemas/Querybuildertypesv5Function'
type: array
groupBy:
items:
$ref: '#/components/schemas/Querybuildertypesv5GroupByKey'
type: array
having:
$ref: '#/components/schemas/Querybuildertypesv5Having'
legend:
type: string
limit:
type: integer
limitBy:
$ref: '#/components/schemas/Querybuildertypesv5LimitBy'
name:
type: string
offset:
type: integer
order:
items:
$ref: '#/components/schemas/Querybuildertypesv5OrderBy'
type: array
secondaryAggregations:
items:
$ref: '#/components/schemas/Querybuildertypesv5SecondaryAggregation'
type: array
selectFields:
items:
$ref: '#/components/schemas/TelemetrytypesTelemetryFieldKey'
type: array
signal:
$ref: '#/components/schemas/TelemetrytypesSignal'
source:
$ref: '#/components/schemas/TelemetrytypesSource'
stepInterval:
$ref: '#/components/schemas/Querybuildertypesv5Step'
type: object
Querybuildertypesv5QueryBuilderQueryGithubComSigNozSignozPkgTypesQuerybuildertypesQuerybuildertypesv5MetricAggregation:
properties:
aggregations:
items:
$ref: '#/components/schemas/Querybuildertypesv5MetricAggregation'
type: array
cursor:
type: string
disabled:
type: boolean
filter:
$ref: '#/components/schemas/Querybuildertypesv5Filter'
functions:
items:
$ref: '#/components/schemas/Querybuildertypesv5Function'
type: array
groupBy:
items:
$ref: '#/components/schemas/Querybuildertypesv5GroupByKey'
type: array
having:
$ref: '#/components/schemas/Querybuildertypesv5Having'
legend:
type: string
limit:
type: integer
limitBy:
$ref: '#/components/schemas/Querybuildertypesv5LimitBy'
name:
type: string
offset:
type: integer
order:
items:
$ref: '#/components/schemas/Querybuildertypesv5OrderBy'
type: array
secondaryAggregations:
items:
$ref: '#/components/schemas/Querybuildertypesv5SecondaryAggregation'
type: array
selectFields:
items:
$ref: '#/components/schemas/TelemetrytypesTelemetryFieldKey'
type: array
signal:
$ref: '#/components/schemas/TelemetrytypesSignal'
source:
$ref: '#/components/schemas/TelemetrytypesSource'
stepInterval:
$ref: '#/components/schemas/Querybuildertypesv5Step'
type: object
Querybuildertypesv5QueryBuilderQueryGithubComSigNozSignozPkgTypesQuerybuildertypesQuerybuildertypesv5TraceAggregation:
properties:
aggregations:
items:
$ref: '#/components/schemas/Querybuildertypesv5TraceAggregation'
type: array
cursor:
type: string
disabled:
type: boolean
filter:
$ref: '#/components/schemas/Querybuildertypesv5Filter'
functions:
items:
$ref: '#/components/schemas/Querybuildertypesv5Function'
type: array
groupBy:
items:
$ref: '#/components/schemas/Querybuildertypesv5GroupByKey'
type: array
having:
$ref: '#/components/schemas/Querybuildertypesv5Having'
legend:
type: string
limit:
type: integer
limitBy:
$ref: '#/components/schemas/Querybuildertypesv5LimitBy'
name:
type: string
offset:
type: integer
order:
items:
$ref: '#/components/schemas/Querybuildertypesv5OrderBy'
type: array
secondaryAggregations:
items:
$ref: '#/components/schemas/Querybuildertypesv5SecondaryAggregation'
type: array
selectFields:
items:
$ref: '#/components/schemas/TelemetrytypesTelemetryFieldKey'
type: array
signal:
$ref: '#/components/schemas/TelemetrytypesSignal'
source:
$ref: '#/components/schemas/TelemetrytypesSource'
stepInterval:
$ref: '#/components/schemas/Querybuildertypesv5Step'
type: object
Querybuildertypesv5QueryBuilderTraceOperator:
properties:
aggregations:
items:
$ref: '#/components/schemas/Querybuildertypesv5TraceAggregation'
type: array
cursor:
type: string
disabled:
type: boolean
expression:
type: string
filter:
$ref: '#/components/schemas/Querybuildertypesv5Filter'
functions:
items:
$ref: '#/components/schemas/Querybuildertypesv5Function'
type: array
groupBy:
items:
$ref: '#/components/schemas/Querybuildertypesv5GroupByKey'
type: array
having:
$ref: '#/components/schemas/Querybuildertypesv5Having'
legend:
type: string
limit:
type: integer
name:
type: string
offset:
type: integer
order:
items:
$ref: '#/components/schemas/Querybuildertypesv5OrderBy'
type: array
returnSpansFrom:
type: string
selectFields:
items:
$ref: '#/components/schemas/TelemetrytypesTelemetryFieldKey'
type: array
stepInterval:
$ref: '#/components/schemas/Querybuildertypesv5Step'
type: object
Querybuildertypesv5QueryData:
oneOf:
- $ref: '#/components/schemas/Querybuildertypesv5TimeSeriesData'
- $ref: '#/components/schemas/Querybuildertypesv5ScalarData'
- $ref: '#/components/schemas/Querybuildertypesv5RawData'
properties:
results:
items: {}
nullable: true
type: array
type: object
Querybuildertypesv5QueryEnvelope:
oneOf:
- $ref: '#/components/schemas/Querybuildertypesv5QueryEnvelopeBuilderTrace'
- $ref: '#/components/schemas/Querybuildertypesv5QueryEnvelopeBuilderLog'
- $ref: '#/components/schemas/Querybuildertypesv5QueryEnvelopeBuilderMetric'
- $ref: '#/components/schemas/Querybuildertypesv5QueryEnvelopeFormula'
- $ref: '#/components/schemas/Querybuildertypesv5QueryEnvelopeTraceOperator'
- $ref: '#/components/schemas/Querybuildertypesv5QueryEnvelopePromQL'
- $ref: '#/components/schemas/Querybuildertypesv5QueryEnvelopeClickHouseSQL'
properties:
spec: {}
type:
$ref: '#/components/schemas/Querybuildertypesv5QueryType'
type: object
Querybuildertypesv5QueryEnvelopeBuilderLog:
properties:
spec:
$ref: '#/components/schemas/Querybuildertypesv5QueryBuilderQueryGithubComSigNozSignozPkgTypesQuerybuildertypesQuerybuildertypesv5LogAggregation'
type:
$ref: '#/components/schemas/Querybuildertypesv5QueryType'
type: object
Querybuildertypesv5QueryEnvelopeBuilderMetric:
properties:
spec:
$ref: '#/components/schemas/Querybuildertypesv5QueryBuilderQueryGithubComSigNozSignozPkgTypesQuerybuildertypesQuerybuildertypesv5MetricAggregation'
type:
$ref: '#/components/schemas/Querybuildertypesv5QueryType'
type: object
Querybuildertypesv5QueryEnvelopeBuilderTrace:
properties:
spec:
$ref: '#/components/schemas/Querybuildertypesv5QueryBuilderQueryGithubComSigNozSignozPkgTypesQuerybuildertypesQuerybuildertypesv5TraceAggregation'
type:
$ref: '#/components/schemas/Querybuildertypesv5QueryType'
type: object
Querybuildertypesv5QueryEnvelopeClickHouseSQL:
properties:
spec:
$ref: '#/components/schemas/Querybuildertypesv5ClickHouseQuery'
type:
$ref: '#/components/schemas/Querybuildertypesv5QueryType'
type: object
Querybuildertypesv5QueryEnvelopeFormula:
properties:
spec:
$ref: '#/components/schemas/Querybuildertypesv5QueryBuilderFormula'
type:
$ref: '#/components/schemas/Querybuildertypesv5QueryType'
type: object
Querybuildertypesv5QueryEnvelopePromQL:
properties:
spec:
$ref: '#/components/schemas/Querybuildertypesv5PromQuery'
type:
$ref: '#/components/schemas/Querybuildertypesv5QueryType'
type: object
Querybuildertypesv5QueryEnvelopeTraceOperator:
properties:
spec:
$ref: '#/components/schemas/Querybuildertypesv5QueryBuilderTraceOperator'
type:
$ref: '#/components/schemas/Querybuildertypesv5QueryType'
type: object
Querybuildertypesv5QueryRangeRequest:
description: Request body for the v5 query range endpoint. Supports builder
queries (traces, logs, metrics), formulas, joins, trace operators, PromQL,
and ClickHouse SQL queries.
example:
compositeQuery:
queries:
- spec:
aggregations:
- alias: span_count
expression: count()
filter:
expression: service.name = 'frontend'
groupBy:
- fieldContext: resource
name: service.name
limit: 10
name: A
order:
- direction: desc
key:
name: span_count
signal: traces
stepInterval: 60s
type: builder_query
end: 1.6409988e+12
requestType: time_series
schemaVersion: v1
start: 1.6409952e+12
properties:
compositeQuery:
$ref: '#/components/schemas/Querybuildertypesv5CompositeQuery'
end:
minimum: 0
type: integer
formatOptions:
$ref: '#/components/schemas/Querybuildertypesv5FormatOptions'
noCache:
type: boolean
requestType:
$ref: '#/components/schemas/Querybuildertypesv5RequestType'
schemaVersion:
type: string
start:
minimum: 0
type: integer
variables:
additionalProperties:
$ref: '#/components/schemas/Querybuildertypesv5VariableItem'
type: object
type: object
Querybuildertypesv5QueryRangeResponse:
description: 'Response from the v5 query range endpoint. The data.results array
contains typed results depending on the requestType: TimeSeriesData for time_series,
ScalarData for scalar, or RawData for raw requests.'
properties:
data:
$ref: '#/components/schemas/Querybuildertypesv5QueryData'
meta:
$ref: '#/components/schemas/Querybuildertypesv5ExecStats'
type:
$ref: '#/components/schemas/Querybuildertypesv5RequestType'
type: string
warning:
$ref: '#/components/schemas/Querybuildertypesv5QueryWarnData'
type: object
Querybuildertypesv5QueryType:
enum:
- builder_query
- builder_formula
- builder_trace_operator
- clickhouse_sql
- promql
type: string
Querybuildertypesv5QueryWarnData:
properties:
message:
@@ -5036,153 +4390,6 @@ components:
message:
type: string
type: object
Querybuildertypesv5RawData:
properties:
nextCursor:
type: string
queryName:
type: string
rows:
items:
$ref: '#/components/schemas/Querybuildertypesv5RawRow'
nullable: true
type: array
type: object
Querybuildertypesv5RawRow:
properties:
data:
additionalProperties: {}
nullable: true
type: object
timestamp:
format: date-time
type: string
type: object
Querybuildertypesv5ReduceTo:
enum:
- sum
- count
- avg
- min
- max
- last
- median
type: string
Querybuildertypesv5RequestType:
enum:
- scalar
- time_series
- raw
- raw_stream
- trace
type: string
Querybuildertypesv5ScalarData:
properties:
columns:
items:
$ref: '#/components/schemas/Querybuildertypesv5ColumnDescriptor'
nullable: true
type: array
data:
items:
items: {}
type: array
nullable: true
type: array
queryName:
type: string
type: object
Querybuildertypesv5SecondaryAggregation:
properties:
alias:
type: string
expression:
type: string
groupBy:
items:
$ref: '#/components/schemas/Querybuildertypesv5GroupByKey'
type: array
limit:
type: integer
limitBy:
$ref: '#/components/schemas/Querybuildertypesv5LimitBy'
order:
items:
$ref: '#/components/schemas/Querybuildertypesv5OrderBy'
type: array
stepInterval:
$ref: '#/components/schemas/Querybuildertypesv5Step'
type: object
Querybuildertypesv5Step:
description: Step interval. Accepts a Go duration string (e.g., "60s", "1m",
"1h") or a number representing seconds (e.g., 60).
oneOf:
- description: Duration string (e.g., "60s", "5m", "1h").
example: 60s
type: string
- description: Duration in seconds.
example: 60
type: number
Querybuildertypesv5TimeSeries:
properties:
labels:
items:
$ref: '#/components/schemas/Querybuildertypesv5Label'
type: array
values:
items:
$ref: '#/components/schemas/Querybuildertypesv5TimeSeriesValue'
nullable: true
type: array
type: object
Querybuildertypesv5TimeSeriesData:
properties:
aggregations:
items:
$ref: '#/components/schemas/Querybuildertypesv5AggregationBucket'
nullable: true
type: array
queryName:
type: string
type: object
Querybuildertypesv5TimeSeriesValue:
properties:
bucket:
$ref: '#/components/schemas/Querybuildertypesv5Bucket'
partial:
type: boolean
timestamp:
format: int64
type: integer
value:
format: double
type: number
values:
items:
format: double
type: number
type: array
type: object
Querybuildertypesv5TraceAggregation:
properties:
alias:
type: string
expression:
type: string
type: object
Querybuildertypesv5VariableItem:
properties:
type:
$ref: '#/components/schemas/Querybuildertypesv5VariableType'
value: {}
type: object
Querybuildertypesv5VariableType:
enum:
- query
- dynamic
- custom
- text
type: string
RenderErrorResponse:
properties:
error:
@@ -5209,23 +4416,6 @@ components:
format: date-time
type: string
type: object
TelemetrytypesFieldContext:
enum:
- metric
- log
- span
- resource
- attribute
- body
type: string
TelemetrytypesFieldDataType:
enum:
- string
- bool
- float64
- int64
- number
type: string
TelemetrytypesGettableFieldKeys:
properties:
complete:
@@ -5245,28 +4435,18 @@ components:
values:
$ref: '#/components/schemas/TelemetrytypesTelemetryFieldValues'
type: object
TelemetrytypesSignal:
enum:
- traces
- logs
- metrics
type: string
TelemetrytypesSource:
enum:
- meter
type: string
TelemetrytypesTelemetryFieldKey:
properties:
description:
type: string
fieldContext:
$ref: '#/components/schemas/TelemetrytypesFieldContext'
type: string
fieldDataType:
$ref: '#/components/schemas/TelemetrytypesFieldDataType'
type: string
name:
type: string
signal:
$ref: '#/components/schemas/TelemetrytypesSignal'
type: string
unit:
type: string
type: object

View File

@@ -2,12 +2,18 @@ package openfgaauthz
import (
"context"
"slices"
"github.com/SigNoz/signoz/ee/authz/openfgaserver"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/authz/authzstore/sqlauthzstore"
pkgopenfgaauthz "github.com/SigNoz/signoz/pkg/authz/openfgaauthz"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/licensing"
"github.com/SigNoz/signoz/pkg/sqlstore"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/roletypes"
"github.com/SigNoz/signoz/pkg/valuer"
openfgav1 "github.com/openfga/api/proto/openfga/v1"
openfgapkgtransformer "github.com/openfga/language/pkg/go/transformer"
@@ -15,84 +21,320 @@ import (
type provider struct {
pkgAuthzService authz.AuthZ
openfgaServer *openfgaserver.Server
licensing licensing.Licensing
store roletypes.Store
registry []authz.RegisterTypeable
}
func NewProviderFactory(sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile) factory.ProviderFactory[authz.AuthZ, authz.Config] {
func NewProviderFactory(sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile, licensing licensing.Licensing, registry ...authz.RegisterTypeable) factory.ProviderFactory[authz.AuthZ, authz.Config] {
return factory.NewProviderFactory(factory.MustNewName("openfga"), func(ctx context.Context, ps factory.ProviderSettings, config authz.Config) (authz.AuthZ, error) {
return newOpenfgaProvider(ctx, ps, config, sqlstore, openfgaSchema)
return newOpenfgaProvider(ctx, ps, config, sqlstore, openfgaSchema, licensing, registry)
})
}
func newOpenfgaProvider(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile) (authz.AuthZ, error) {
func newOpenfgaProvider(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile, licensing licensing.Licensing, registry []authz.RegisterTypeable) (authz.AuthZ, error) {
pkgOpenfgaAuthzProvider := pkgopenfgaauthz.NewProviderFactory(sqlstore, openfgaSchema)
pkgAuthzService, err := pkgOpenfgaAuthzProvider.New(ctx, settings, config)
if err != nil {
return nil, err
}
openfgaServer, err := openfgaserver.NewOpenfgaServer(ctx, pkgAuthzService)
if err != nil {
return nil, err
}
return &provider{
pkgAuthzService: pkgAuthzService,
openfgaServer: openfgaServer,
licensing: licensing,
store: sqlauthzstore.NewSqlAuthzStore(sqlstore),
registry: registry,
}, nil
}
func (provider *provider) Start(ctx context.Context) error {
return provider.pkgAuthzService.Start(ctx)
return provider.openfgaServer.Start(ctx)
}
func (provider *provider) Stop(ctx context.Context) error {
return provider.pkgAuthzService.Stop(ctx)
return provider.openfgaServer.Stop(ctx)
}
func (provider *provider) Check(ctx context.Context, tuple *openfgav1.TupleKey) error {
return provider.pkgAuthzService.Check(ctx, tuple)
return provider.openfgaServer.Check(ctx, tuple)
}
func (provider *provider) CheckWithTupleCreation(ctx context.Context, claims authtypes.Claims, orgID valuer.UUID, relation authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector, _ []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeableUser, claims.UserID, orgID, nil)
if err != nil {
return err
}
tuples, err := typeable.Tuples(subject, relation, selectors, orgID)
if err != nil {
return err
}
err = provider.BatchCheck(ctx, tuples)
if err != nil {
return err
}
return nil
func (provider *provider) CheckWithTupleCreation(ctx context.Context, claims authtypes.Claims, orgID valuer.UUID, relation authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector, roleSelectors []authtypes.Selector) error {
return provider.openfgaServer.CheckWithTupleCreation(ctx, claims, orgID, relation, typeable, selectors, roleSelectors)
}
func (provider *provider) CheckWithTupleCreationWithoutClaims(ctx context.Context, orgID valuer.UUID, relation authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector, _ []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeableAnonymous, authtypes.AnonymousUser.String(), orgID, nil)
if err != nil {
return err
}
tuples, err := typeable.Tuples(subject, relation, selectors, orgID)
if err != nil {
return err
}
err = provider.BatchCheck(ctx, tuples)
if err != nil {
return err
}
return nil
func (provider *provider) CheckWithTupleCreationWithoutClaims(ctx context.Context, orgID valuer.UUID, relation authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector, roleSelectors []authtypes.Selector) error {
return provider.openfgaServer.CheckWithTupleCreationWithoutClaims(ctx, orgID, relation, typeable, selectors, roleSelectors)
}
func (provider *provider) BatchCheck(ctx context.Context, tuples []*openfgav1.TupleKey) error {
return provider.pkgAuthzService.BatchCheck(ctx, tuples)
return provider.openfgaServer.BatchCheck(ctx, tuples)
}
func (provider *provider) ListObjects(ctx context.Context, subject string, relation authtypes.Relation, typeable authtypes.Typeable) ([]*authtypes.Object, error) {
return provider.pkgAuthzService.ListObjects(ctx, subject, relation, typeable)
return provider.openfgaServer.ListObjects(ctx, subject, relation, typeable)
}
func (provider *provider) Write(ctx context.Context, additions []*openfgav1.TupleKey, deletions []*openfgav1.TupleKey) error {
return provider.pkgAuthzService.Write(ctx, additions, deletions)
return provider.openfgaServer.Write(ctx, additions, deletions)
}
func (provider *provider) Get(ctx context.Context, orgID valuer.UUID, id valuer.UUID) (*roletypes.Role, error) {
return provider.pkgAuthzService.Get(ctx, orgID, id)
}
func (provider *provider) GetByOrgIDAndName(ctx context.Context, orgID valuer.UUID, name string) (*roletypes.Role, error) {
return provider.pkgAuthzService.GetByOrgIDAndName(ctx, orgID, name)
}
func (provider *provider) List(ctx context.Context, orgID valuer.UUID) ([]*roletypes.Role, error) {
return provider.pkgAuthzService.List(ctx, orgID)
}
func (provider *provider) ListByOrgIDAndNames(ctx context.Context, orgID valuer.UUID, names []string) ([]*roletypes.Role, error) {
return provider.pkgAuthzService.ListByOrgIDAndNames(ctx, orgID, names)
}
func (provider *provider) Grant(ctx context.Context, orgID valuer.UUID, name string, subject string) error {
return provider.pkgAuthzService.Grant(ctx, orgID, name, subject)
}
func (provider *provider) ModifyGrant(ctx context.Context, orgID valuer.UUID, existingRoleName string, updatedRoleName string, subject string) error {
return provider.pkgAuthzService.ModifyGrant(ctx, orgID, existingRoleName, updatedRoleName, subject)
}
func (provider *provider) Revoke(ctx context.Context, orgID valuer.UUID, name string, subject string) error {
return provider.pkgAuthzService.Revoke(ctx, orgID, name, subject)
}
func (provider *provider) CreateManagedRoles(ctx context.Context, orgID valuer.UUID, managedRoles []*roletypes.Role) error {
return provider.pkgAuthzService.CreateManagedRoles(ctx, orgID, managedRoles)
}
func (provider *provider) CreateManagedUserRoleTransactions(ctx context.Context, orgID valuer.UUID, userID valuer.UUID) error {
tuples := make([]*openfgav1.TupleKey, 0)
grantTuples, err := provider.getManagedRoleGrantTuples(orgID, userID)
if err != nil {
return err
}
tuples = append(tuples, grantTuples...)
managedRoleTuples, err := provider.getManagedRoleTransactionTuples(orgID)
if err != nil {
return err
}
tuples = append(tuples, managedRoleTuples...)
return provider.Write(ctx, tuples, nil)
}
func (provider *provider) Create(ctx context.Context, orgID valuer.UUID, role *roletypes.Role) error {
_, err := provider.licensing.GetActive(ctx, orgID)
if err != nil {
return errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
return provider.store.Create(ctx, roletypes.NewStorableRoleFromRole(role))
}
func (provider *provider) GetOrCreate(ctx context.Context, orgID valuer.UUID, role *roletypes.Role) (*roletypes.Role, error) {
_, err := provider.licensing.GetActive(ctx, orgID)
if err != nil {
return nil, errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
existingRole, err := provider.store.GetByOrgIDAndName(ctx, role.OrgID, role.Name)
if err != nil {
if !errors.Ast(err, errors.TypeNotFound) {
return nil, err
}
}
if existingRole != nil {
return roletypes.NewRoleFromStorableRole(existingRole), nil
}
err = provider.store.Create(ctx, roletypes.NewStorableRoleFromRole(role))
if err != nil {
return nil, err
}
return role, nil
}
func (provider *provider) GetResources(_ context.Context) []*authtypes.Resource {
typeables := make([]authtypes.Typeable, 0)
for _, register := range provider.registry {
typeables = append(typeables, register.MustGetTypeables()...)
}
// role module cannot self register itself!
typeables = append(typeables, provider.MustGetTypeables()...)
resources := make([]*authtypes.Resource, 0)
for _, typeable := range typeables {
resources = append(resources, &authtypes.Resource{Name: typeable.Name(), Type: typeable.Type()})
}
return resources
}
func (provider *provider) GetObjects(ctx context.Context, orgID valuer.UUID, id valuer.UUID, relation authtypes.Relation) ([]*authtypes.Object, error) {
storableRole, err := provider.store.Get(ctx, orgID, id)
if err != nil {
return nil, err
}
objects := make([]*authtypes.Object, 0)
for _, resource := range provider.GetResources(ctx) {
if slices.Contains(authtypes.TypeableRelations[resource.Type], relation) {
resourceObjects, err := provider.
ListObjects(
ctx,
authtypes.MustNewSubject(authtypes.TypeableRole, storableRole.ID.String(), orgID, &authtypes.RelationAssignee),
relation,
authtypes.MustNewTypeableFromType(resource.Type, resource.Name),
)
if err != nil {
return nil, err
}
objects = append(objects, resourceObjects...)
}
}
return objects, nil
}
func (provider *provider) Patch(ctx context.Context, orgID valuer.UUID, role *roletypes.Role) error {
_, err := provider.licensing.GetActive(ctx, orgID)
if err != nil {
return errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
return provider.store.Update(ctx, orgID, roletypes.NewStorableRoleFromRole(role))
}
func (provider *provider) PatchObjects(ctx context.Context, orgID valuer.UUID, name string, relation authtypes.Relation, additions, deletions []*authtypes.Object) error {
_, err := provider.licensing.GetActive(ctx, orgID)
if err != nil {
return errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
additionTuples, err := roletypes.GetAdditionTuples(name, orgID, relation, additions)
if err != nil {
return err
}
deletionTuples, err := roletypes.GetDeletionTuples(name, orgID, relation, deletions)
if err != nil {
return err
}
err = provider.Write(ctx, additionTuples, deletionTuples)
if err != nil {
return err
}
return nil
}
func (provider *provider) Delete(ctx context.Context, orgID valuer.UUID, id valuer.UUID) error {
_, err := provider.licensing.GetActive(ctx, orgID)
if err != nil {
return errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
storableRole, err := provider.store.Get(ctx, orgID, id)
if err != nil {
return err
}
role := roletypes.NewRoleFromStorableRole(storableRole)
err = role.CanEditDelete()
if err != nil {
return err
}
return provider.store.Delete(ctx, orgID, id)
}
func (provider *provider) MustGetTypeables() []authtypes.Typeable {
return []authtypes.Typeable{authtypes.TypeableRole, roletypes.TypeableResourcesRoles}
}
func (provider *provider) getManagedRoleGrantTuples(orgID valuer.UUID, userID valuer.UUID) ([]*openfgav1.TupleKey, error) {
tuples := []*openfgav1.TupleKey{}
// Grant the admin role to the user
adminSubject := authtypes.MustNewSubject(authtypes.TypeableUser, userID.String(), orgID, nil)
adminTuple, err := authtypes.TypeableRole.Tuples(
adminSubject,
authtypes.RelationAssignee,
[]authtypes.Selector{
authtypes.MustNewSelector(authtypes.TypeRole, roletypes.SigNozAdminRoleName),
},
orgID,
)
if err != nil {
return nil, err
}
tuples = append(tuples, adminTuple...)
// Grant the admin role to the anonymous user
anonymousSubject := authtypes.MustNewSubject(authtypes.TypeableAnonymous, authtypes.AnonymousUser.String(), orgID, nil)
anonymousTuple, err := authtypes.TypeableRole.Tuples(
anonymousSubject,
authtypes.RelationAssignee,
[]authtypes.Selector{
authtypes.MustNewSelector(authtypes.TypeRole, roletypes.SigNozAnonymousRoleName),
},
orgID,
)
if err != nil {
return nil, err
}
tuples = append(tuples, anonymousTuple...)
return tuples, nil
}
func (provider *provider) getManagedRoleTransactionTuples(orgID valuer.UUID) ([]*openfgav1.TupleKey, error) {
transactionsByRole := make(map[string][]*authtypes.Transaction)
for _, register := range provider.registry {
for roleName, txns := range register.MustGetManagedRoleTransactions() {
transactionsByRole[roleName] = append(transactionsByRole[roleName], txns...)
}
}
tuples := make([]*openfgav1.TupleKey, 0)
for roleName, transactions := range transactionsByRole {
for _, txn := range transactions {
typeable := authtypes.MustNewTypeableFromType(txn.Object.Resource.Type, txn.Object.Resource.Name)
txnTuples, err := typeable.Tuples(
authtypes.MustNewSubject(
authtypes.TypeableRole,
roleName,
orgID,
&authtypes.RelationAssignee,
),
txn.Relation,
[]authtypes.Selector{txn.Object.Selector},
orgID,
)
if err != nil {
return nil, err
}
tuples = append(tuples, txnTuples...)
}
}
return tuples, nil
}

View File

@@ -0,0 +1,83 @@
package openfgaserver
import (
"context"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/valuer"
openfgav1 "github.com/openfga/api/proto/openfga/v1"
)
type Server struct {
pkgAuthzService authz.AuthZ
}
func NewOpenfgaServer(ctx context.Context, pkgAuthzService authz.AuthZ) (*Server, error) {
return &Server{
pkgAuthzService: pkgAuthzService,
}, nil
}
func (server *Server) Start(ctx context.Context) error {
return server.pkgAuthzService.Start(ctx)
}
func (server *Server) Stop(ctx context.Context) error {
return server.pkgAuthzService.Stop(ctx)
}
func (server *Server) Check(ctx context.Context, tuple *openfgav1.TupleKey) error {
return server.pkgAuthzService.Check(ctx, tuple)
}
func (server *Server) CheckWithTupleCreation(ctx context.Context, claims authtypes.Claims, orgID valuer.UUID, relation authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector, _ []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeableUser, claims.UserID, orgID, nil)
if err != nil {
return err
}
tuples, err := typeable.Tuples(subject, relation, selectors, orgID)
if err != nil {
return err
}
err = server.BatchCheck(ctx, tuples)
if err != nil {
return err
}
return nil
}
func (server *Server) CheckWithTupleCreationWithoutClaims(ctx context.Context, orgID valuer.UUID, relation authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector, _ []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeableAnonymous, authtypes.AnonymousUser.String(), orgID, nil)
if err != nil {
return err
}
tuples, err := typeable.Tuples(subject, relation, selectors, orgID)
if err != nil {
return err
}
err = server.BatchCheck(ctx, tuples)
if err != nil {
return err
}
return nil
}
func (server *Server) BatchCheck(ctx context.Context, tuples []*openfgav1.TupleKey) error {
return server.pkgAuthzService.BatchCheck(ctx, tuples)
}
func (server *Server) ListObjects(ctx context.Context, subject string, relation authtypes.Relation, typeable authtypes.Typeable) ([]*authtypes.Object, error) {
return server.pkgAuthzService.ListObjects(ctx, subject, relation, typeable)
}
func (server *Server) Write(ctx context.Context, additions []*openfgav1.TupleKey, deletions []*openfgav1.TupleKey) error {
return server.pkgAuthzService.Write(ctx, additions, deletions)
}

View File

@@ -11,7 +11,6 @@ import (
"github.com/SigNoz/signoz/pkg/modules/dashboard"
pkgimpldashboard "github.com/SigNoz/signoz/pkg/modules/dashboard/impldashboard"
"github.com/SigNoz/signoz/pkg/modules/organization"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/querier"
"github.com/SigNoz/signoz/pkg/queryparser"
"github.com/SigNoz/signoz/pkg/types"
@@ -26,13 +25,11 @@ type module struct {
pkgDashboardModule dashboard.Module
store dashboardtypes.Store
settings factory.ScopedProviderSettings
roleSetter role.Setter
granter role.Granter
querier querier.Querier
licensing licensing.Licensing
}
func NewModule(store dashboardtypes.Store, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, roleSetter role.Setter, granter role.Granter, queryParser queryparser.QueryParser, querier querier.Querier, licensing licensing.Licensing) dashboard.Module {
func NewModule(store dashboardtypes.Store, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, queryParser queryparser.QueryParser, querier querier.Querier, licensing licensing.Licensing) dashboard.Module {
scopedProviderSettings := factory.NewScopedProviderSettings(settings, "github.com/SigNoz/signoz/ee/modules/dashboard/impldashboard")
pkgDashboardModule := pkgimpldashboard.NewModule(store, settings, analytics, orgGetter, queryParser)
@@ -40,8 +37,6 @@ func NewModule(store dashboardtypes.Store, settings factory.ProviderSettings, an
pkgDashboardModule: pkgDashboardModule,
store: store,
settings: scopedProviderSettings,
roleSetter: roleSetter,
granter: granter,
querier: querier,
licensing: licensing,
}
@@ -61,29 +56,6 @@ func (module *module) CreatePublic(ctx context.Context, orgID valuer.UUID, publi
return errors.Newf(errors.TypeAlreadyExists, dashboardtypes.ErrCodePublicDashboardAlreadyExists, "dashboard with id %s is already public", storablePublicDashboard.DashboardID)
}
role, err := module.roleSetter.GetOrCreate(ctx, orgID, roletypes.NewRole(roletypes.SigNozAnonymousRoleName, roletypes.SigNozAnonymousRoleDescription, roletypes.RoleTypeManaged, orgID))
if err != nil {
return err
}
err = module.granter.Grant(ctx, orgID, roletypes.SigNozAnonymousRoleName, authtypes.MustNewSubject(authtypes.TypeableAnonymous, authtypes.AnonymousUser.StringValue(), orgID, nil))
if err != nil {
return err
}
additionObject := authtypes.MustNewObject(
authtypes.Resource{
Name: dashboardtypes.TypeableMetaResourcePublicDashboard.Name(),
Type: authtypes.TypeMetaResource,
},
authtypes.MustNewSelector(authtypes.TypeMetaResource, publicDashboard.ID.String()),
)
err = module.roleSetter.PatchObjects(ctx, orgID, role.Name, authtypes.RelationRead, []*authtypes.Object{additionObject}, nil)
if err != nil {
return err
}
err = module.store.CreatePublic(ctx, dashboardtypes.NewStorablePublicDashboardFromPublicDashboard(publicDashboard))
if err != nil {
return err
@@ -128,6 +100,7 @@ func (module *module) GetPublicDashboardSelectorsAndOrg(ctx context.Context, id
return []authtypes.Selector{
authtypes.MustNewSelector(authtypes.TypeMetaResource, id.StringValue()),
authtypes.MustNewSelector(authtypes.TypeMetaResource, authtypes.WildCardSelectorString),
}, storableDashboard.OrgID, nil
}
@@ -190,29 +163,6 @@ func (module *module) DeletePublic(ctx context.Context, orgID valuer.UUID, dashb
return errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
publicDashboard, err := module.GetPublic(ctx, orgID, dashboardID)
if err != nil {
return err
}
role, err := module.roleSetter.GetOrCreate(ctx, orgID, roletypes.NewRole(roletypes.SigNozAnonymousRoleName, roletypes.SigNozAnonymousRoleDescription, roletypes.RoleTypeManaged, orgID))
if err != nil {
return err
}
deletionObject := authtypes.MustNewObject(
authtypes.Resource{
Name: dashboardtypes.TypeableMetaResourcePublicDashboard.Name(),
Type: authtypes.TypeMetaResource,
},
authtypes.MustNewSelector(authtypes.TypeMetaResource, publicDashboard.ID.String()),
)
err = module.roleSetter.PatchObjects(ctx, orgID, role.Name, authtypes.RelationRead, nil, []*authtypes.Object{deletionObject})
if err != nil {
return err
}
err = module.store.DeletePublic(ctx, dashboardID.StringValue())
if err != nil {
return err
@@ -250,10 +200,6 @@ func (module *module) GetByMetricNames(ctx context.Context, orgID valuer.UUID, m
return module.pkgDashboardModule.GetByMetricNames(ctx, orgID, metricNames)
}
func (module *module) MustGetTypeables() []authtypes.Typeable {
return module.pkgDashboardModule.MustGetTypeables()
}
func (module *module) List(ctx context.Context, orgID valuer.UUID) ([]*dashboardtypes.Dashboard, error) {
return module.pkgDashboardModule.List(ctx, orgID)
}
@@ -266,34 +212,27 @@ func (module *module) LockUnlock(ctx context.Context, orgID valuer.UUID, id valu
return module.pkgDashboardModule.LockUnlock(ctx, orgID, id, updatedBy, role, lock)
}
func (module *module) deletePublic(ctx context.Context, orgID valuer.UUID, dashboardID valuer.UUID) error {
publicDashboard, err := module.store.GetPublic(ctx, dashboardID.String())
if err != nil {
return err
}
role, err := module.roleSetter.GetOrCreate(ctx, orgID, roletypes.NewRole(roletypes.SigNozAnonymousRoleName, roletypes.SigNozAnonymousRoleDescription, roletypes.RoleTypeManaged, orgID))
if err != nil {
return err
}
deletionObject := authtypes.MustNewObject(
authtypes.Resource{
Name: dashboardtypes.TypeableMetaResourcePublicDashboard.Name(),
Type: authtypes.TypeMetaResource,
},
authtypes.MustNewSelector(authtypes.TypeMetaResource, publicDashboard.ID.String()),
)
err = module.roleSetter.PatchObjects(ctx, orgID, role.Name, authtypes.RelationRead, nil, []*authtypes.Object{deletionObject})
if err != nil {
return err
}
err = module.store.DeletePublic(ctx, dashboardID.StringValue())
if err != nil {
return err
}
return nil
func (module *module) MustGetTypeables() []authtypes.Typeable {
return module.pkgDashboardModule.MustGetTypeables()
}
func (module *module) MustGetManagedRoleTransactions() map[string][]*authtypes.Transaction {
return map[string][]*authtypes.Transaction{
roletypes.SigNozAnonymousRoleName: {
{
Relation: authtypes.RelationRead,
Object: *authtypes.MustNewObject(
authtypes.Resource{
Type: authtypes.TypeMetaResource,
Name: dashboardtypes.TypeableMetaResourcePublicDashboard.Name(),
},
authtypes.MustNewSelector(authtypes.TypeMetaResource, "*"),
),
},
},
}
}
func (module *module) deletePublic(ctx context.Context, _ valuer.UUID, dashboardID valuer.UUID) error {
return module.store.DeletePublic(ctx, dashboardID.StringValue())
}

View File

@@ -1,165 +0,0 @@
package implrole
import (
"context"
"slices"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/licensing"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/roletypes"
"github.com/SigNoz/signoz/pkg/valuer"
)
type setter struct {
store roletypes.Store
authz authz.AuthZ
licensing licensing.Licensing
registry []role.RegisterTypeable
}
func NewSetter(store roletypes.Store, authz authz.AuthZ, licensing licensing.Licensing, registry []role.RegisterTypeable) role.Setter {
return &setter{
store: store,
authz: authz,
licensing: licensing,
registry: registry,
}
}
func (setter *setter) Create(ctx context.Context, orgID valuer.UUID, role *roletypes.Role) error {
_, err := setter.licensing.GetActive(ctx, orgID)
if err != nil {
return errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
return setter.store.Create(ctx, roletypes.NewStorableRoleFromRole(role))
}
func (setter *setter) GetOrCreate(ctx context.Context, orgID valuer.UUID, role *roletypes.Role) (*roletypes.Role, error) {
_, err := setter.licensing.GetActive(ctx, orgID)
if err != nil {
return nil, errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
existingRole, err := setter.store.GetByOrgIDAndName(ctx, role.OrgID, role.Name)
if err != nil {
if !errors.Ast(err, errors.TypeNotFound) {
return nil, err
}
}
if existingRole != nil {
return roletypes.NewRoleFromStorableRole(existingRole), nil
}
err = setter.store.Create(ctx, roletypes.NewStorableRoleFromRole(role))
if err != nil {
return nil, err
}
return role, nil
}
func (setter *setter) GetResources(_ context.Context) []*authtypes.Resource {
typeables := make([]authtypes.Typeable, 0)
for _, register := range setter.registry {
typeables = append(typeables, register.MustGetTypeables()...)
}
// the role module cannot register itself!
typeables = append(typeables, setter.MustGetTypeables()...)
resources := make([]*authtypes.Resource, 0)
for _, typeable := range typeables {
resources = append(resources, &authtypes.Resource{Name: typeable.Name(), Type: typeable.Type()})
}
return resources
}
func (setter *setter) GetObjects(ctx context.Context, orgID valuer.UUID, id valuer.UUID, relation authtypes.Relation) ([]*authtypes.Object, error) {
storableRole, err := setter.store.Get(ctx, orgID, id)
if err != nil {
return nil, err
}
objects := make([]*authtypes.Object, 0)
for _, resource := range setter.GetResources(ctx) {
if slices.Contains(authtypes.TypeableRelations[resource.Type], relation) {
resourceObjects, err := setter.
authz.
ListObjects(
ctx,
authtypes.MustNewSubject(authtypes.TypeableRole, storableRole.ID.String(), orgID, &authtypes.RelationAssignee),
relation,
authtypes.MustNewTypeableFromType(resource.Type, resource.Name),
)
if err != nil {
return nil, err
}
objects = append(objects, resourceObjects...)
}
}
return objects, nil
}
func (setter *setter) Patch(ctx context.Context, orgID valuer.UUID, role *roletypes.Role) error {
_, err := setter.licensing.GetActive(ctx, orgID)
if err != nil {
return errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
return setter.store.Update(ctx, orgID, roletypes.NewStorableRoleFromRole(role))
}
func (setter *setter) PatchObjects(ctx context.Context, orgID valuer.UUID, name string, relation authtypes.Relation, additions, deletions []*authtypes.Object) error {
_, err := setter.licensing.GetActive(ctx, orgID)
if err != nil {
return errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
additionTuples, err := roletypes.GetAdditionTuples(name, orgID, relation, additions)
if err != nil {
return err
}
deletionTuples, err := roletypes.GetDeletionTuples(name, orgID, relation, deletions)
if err != nil {
return err
}
err = setter.authz.Write(ctx, additionTuples, deletionTuples)
if err != nil {
return err
}
return nil
}
func (setter *setter) Delete(ctx context.Context, orgID valuer.UUID, id valuer.UUID) error {
_, err := setter.licensing.GetActive(ctx, orgID)
if err != nil {
return errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "a valid license is not available").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
}
storableRole, err := setter.store.Get(ctx, orgID, id)
if err != nil {
return err
}
role := roletypes.NewRoleFromStorableRole(storableRole)
err = role.CanEditDelete()
if err != nil {
return err
}
return setter.store.Delete(ctx, orgID, id)
}
func (setter *setter) MustGetTypeables() []authtypes.Typeable {
return []authtypes.Typeable{authtypes.TypeableRole, roletypes.TypeableResourcesRoles}
}

View File

@@ -10,7 +10,6 @@ import (
"github.com/SigNoz/signoz/ee/query-service/usage"
"github.com/SigNoz/signoz/pkg/alertmanager"
"github.com/SigNoz/signoz/pkg/global"
"github.com/SigNoz/signoz/pkg/http/handler"
"github.com/SigNoz/signoz/pkg/http/middleware"
querierAPI "github.com/SigNoz/signoz/pkg/querier"
baseapp "github.com/SigNoz/signoz/pkg/query-service/app"
@@ -22,8 +21,6 @@ import (
rules "github.com/SigNoz/signoz/pkg/query-service/rules"
"github.com/SigNoz/signoz/pkg/queryparser"
"github.com/SigNoz/signoz/pkg/signoz"
"github.com/SigNoz/signoz/pkg/types/ctxtypes"
qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
"github.com/SigNoz/signoz/pkg/version"
"github.com/gorilla/mux"
)
@@ -109,22 +106,7 @@ func (ah *APIHandler) RegisterRoutes(router *mux.Router, am *middleware.AuthZ) {
router.HandleFunc("/api/v4/query_range", am.ViewAccess(ah.queryRangeV4)).Methods(http.MethodPost)
// v5
router.Handle("/api/v5/query_range", handler.New(am.ViewAccess(ah.queryRangeV5), handler.OpenAPIDef{
ID: "QueryRangeV5",
Tags: []string{"query"},
Summary: "Query range",
Description: "Execute a composite query over a time range. Supports builder queries (traces, logs, metrics), formulas, joins, trace operators, PromQL, and ClickHouse SQL.",
Request: new(qbtypes.QueryRangeRequest),
RequestContentType: "application/json",
Response: new(qbtypes.QueryRangeResponse),
ResponseContentType: "application/json",
SuccessStatusCode: http.StatusOK,
ErrorStatusCodes: []int{http.StatusBadRequest},
SecuritySchemes: []handler.OpenAPISecurityScheme{
{Name: ctxtypes.AuthTypeAPIKey.StringValue(), Scopes: []string{"VIEWER"}},
{Name: ctxtypes.AuthTypeTokenizer.StringValue(), Scopes: []string{"VIEWER"}},
},
})).Methods(http.MethodPost)
router.HandleFunc("/api/v5/query_range", am.ViewAccess(ah.queryRangeV5)).Methods(http.MethodPost)
router.HandleFunc("/api/v5/substitute_vars", am.ViewAccess(ah.QuerierAPI.ReplaceVariables)).Methods(http.MethodPost)

View File

@@ -211,7 +211,7 @@ func (s Server) HealthCheckStatus() chan healthcheck.Status {
func (s *Server) createPublicServer(apiHandler *api.APIHandler, web web.Web) (*http.Server, error) {
r := baseapp.NewRouter()
am := middleware.NewAuthZ(s.signoz.Instrumentation.Logger(), s.signoz.Modules.OrgGetter, s.signoz.Authz, s.signoz.Modules.RoleGetter)
am := middleware.NewAuthZ(s.signoz.Instrumentation.Logger(), s.signoz.Modules.OrgGetter, s.signoz.Authz)
r.Use(otelmux.Middleware(
"apiserver",

View File

@@ -15,7 +15,7 @@ import (
"github.com/SigNoz/signoz/pkg/query-service/common"
"github.com/SigNoz/signoz/pkg/query-service/model"
"github.com/SigNoz/signoz/pkg/transition"
ruletypes "github.com/SigNoz/signoz/pkg/types/ruletypes"
"github.com/SigNoz/signoz/pkg/types/ruletypes"
"github.com/SigNoz/signoz/pkg/valuer"
querierV2 "github.com/SigNoz/signoz/pkg/query-service/app/querier/v2"
@@ -63,6 +63,8 @@ type AnomalyRule struct {
seasonality anomaly.Seasonality
}
var _ baserules.Rule = (*AnomalyRule)(nil)
func NewAnomalyRule(
id string,
orgID valuer.UUID,
@@ -490,7 +492,7 @@ func (r *AnomalyRule) Eval(ctx context.Context, ts time.Time) (int, error) {
continue
}
if a.State == model.StatePending && ts.Sub(a.ActiveAt) >= r.HoldDuration() {
if a.State == model.StatePending && ts.Sub(a.ActiveAt) >= r.HoldDuration().Duration() {
a.State = model.StateFiring
a.FiredAt = ts
state := model.StateFiring
@@ -553,7 +555,7 @@ func (r *AnomalyRule) String() string {
ar := ruletypes.PostableRule{
AlertName: r.Name(),
RuleCondition: r.Condition(),
EvalWindow: ruletypes.Duration(r.EvalWindow()),
EvalWindow: r.EvalWindow(),
Labels: r.Labels().Map(),
Annotations: r.Annotations().Map(),
PreferredChannels: r.PreferredChannels(),

View File

@@ -40,7 +40,7 @@ func TestAnomalyRule_NoData_AlertOnAbsent(t *testing.T) {
// Test basic AlertOnAbsent functionality (without AbsentFor grace period)
baseTime := time.Unix(1700000000, 0)
evalWindow := 5 * time.Minute
evalWindow := valuer.MustParseTextDuration("5m")
evalTime := baseTime.Add(5 * time.Minute)
target := 500.0
@@ -50,8 +50,8 @@ func TestAnomalyRule_NoData_AlertOnAbsent(t *testing.T) {
AlertType: ruletypes.AlertTypeMetric,
RuleType: RuleTypeAnomaly,
Evaluation: &ruletypes.EvaluationEnvelope{Kind: ruletypes.RollingEvaluation, Spec: ruletypes.RollingWindow{
EvalWindow: ruletypes.Duration(evalWindow),
Frequency: ruletypes.Duration(1 * time.Minute),
EvalWindow: evalWindow,
Frequency: valuer.MustParseTextDuration("1m"),
}},
RuleCondition: &ruletypes.RuleCondition{
CompareOp: ruletypes.ValueIsAbove,
@@ -147,7 +147,7 @@ func TestAnomalyRule_NoData_AbsentFor(t *testing.T) {
// 3. Alert fires only if t2 - t1 > AbsentFor
baseTime := time.Unix(1700000000, 0)
evalWindow := 5 * time.Minute
evalWindow := valuer.MustParseTextDuration("5m")
// Set target higher than test data so regular threshold alerts don't fire
target := 500.0
@@ -157,8 +157,8 @@ func TestAnomalyRule_NoData_AbsentFor(t *testing.T) {
AlertType: ruletypes.AlertTypeMetric,
RuleType: RuleTypeAnomaly,
Evaluation: &ruletypes.EvaluationEnvelope{Kind: ruletypes.RollingEvaluation, Spec: ruletypes.RollingWindow{
EvalWindow: ruletypes.Duration(evalWindow),
Frequency: ruletypes.Duration(time.Minute),
EvalWindow: evalWindow,
Frequency: valuer.MustParseTextDuration("1m"),
}},
RuleCondition: &ruletypes.RuleCondition{
CompareOp: ruletypes.ValueIsAbove,

View File

@@ -48,7 +48,7 @@ func PrepareTaskFunc(opts baserules.PrepareTaskOptions) (baserules.Task, error)
rules = append(rules, tr)
// create ch rule task for evaluation
task = newTask(baserules.TaskTypeCh, opts.TaskName, time.Duration(evaluation.GetFrequency()), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
task = newTask(baserules.TaskTypeCh, opts.TaskName, evaluation.GetFrequency().Duration(), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
} else if opts.Rule.RuleType == ruletypes.RuleTypeProm {
@@ -72,7 +72,7 @@ func PrepareTaskFunc(opts baserules.PrepareTaskOptions) (baserules.Task, error)
rules = append(rules, pr)
// create promql rule task for evaluation
task = newTask(baserules.TaskTypeProm, opts.TaskName, time.Duration(evaluation.GetFrequency()), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
task = newTask(baserules.TaskTypeProm, opts.TaskName, evaluation.GetFrequency().Duration(), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
} else if opts.Rule.RuleType == ruletypes.RuleTypeAnomaly {
// create anomaly rule
@@ -96,7 +96,7 @@ func PrepareTaskFunc(opts baserules.PrepareTaskOptions) (baserules.Task, error)
rules = append(rules, ar)
// create anomaly rule task for evaluation
task = newTask(baserules.TaskTypeCh, opts.TaskName, time.Duration(evaluation.GetFrequency()), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
task = newTask(baserules.TaskTypeCh, opts.TaskName, evaluation.GetFrequency().Duration(), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
} else {
return nil, fmt.Errorf("unsupported rule type %s. Supported types: %s, %s", opts.Rule.RuleType, ruletypes.RuleTypeProm, ruletypes.RuleTypeThreshold)
@@ -213,8 +213,7 @@ func TestNotification(opts baserules.PrepareTestRuleOptions) (int, *basemodel.Ap
return alertsFound, nil
}
// newTask returns an appropriate group for
// rule type
// newTask returns an appropriate group for the rule type
func newTask(taskType baserules.TaskType, name string, frequency time.Duration, rules []baserules.Rule, opts *baserules.ManagerOptions, notify baserules.NotifyFunc, maintenanceStore ruletypes.MaintenanceStore, orgID valuer.UUID) baserules.Task {
if taskType == baserules.TaskTypeCh {
return baserules.NewRuleTask(name, "", frequency, rules, opts, notify, maintenanceStore, orgID)

View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="184" height="188" fill="none" viewBox="0 0 184 188"><path fill="#f3b01c" d="M108.092 130.021c18.166-2.018 35.293-11.698 44.723-27.854-4.466 39.961-48.162 65.218-83.83 49.711-3.286-1.425-6.115-3.796-8.056-6.844-8.016-12.586-10.65-28.601-6.865-43.135 10.817 18.668 32.81 30.111 54.028 28.122"/><path fill="#8d2676" d="M53.401 90.174c-7.364 17.017-7.682 36.94 1.345 53.336-31.77-23.902-31.423-75.052-.388-98.715 2.87-2.187 6.282-3.485 9.86-3.683 14.713-.776 29.662 4.91 40.146 15.507-21.3.212-42.046 13.857-50.963 33.555"/><path fill="#ee342f" d="M114.637 61.855C103.89 46.87 87.069 36.668 68.639 36.358c35.625-16.17 79.446 10.047 84.217 48.807.444 3.598-.139 7.267-1.734 10.512-6.656 13.518-18.998 24.002-33.42 27.882 10.567-19.599 9.263-43.544-3.065-61.704"/></svg>

View File

@@ -140,10 +140,7 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
}, [selectedValues, allAvailableValues, enableAllSelection]);
// Define allOptionShown earlier in the code
const allOptionShown = useMemo(
() => value === ALL_SELECTED_VALUE || value === 'ALL',
[value],
);
const allOptionShown = value === ALL_SELECTED_VALUE;
// Value passed to the underlying Ant Select component
const displayValue = useMemo(

View File

@@ -18,8 +18,8 @@ import { useWidgetsByDynamicVariableId } from 'hooks/dashboard/useWidgetsByDynam
import { getWidgetsHavingDynamicVariableAttribute } from 'hooks/dashboard/utils';
import { useGetFieldValues } from 'hooks/dynamicVariables/useGetFieldValues';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { commaValuesParser } from 'lib/dashbaordVariables/customCommaValuesParser';
import sortValues from 'lib/dashbaordVariables/sortVariableValues';
import { commaValuesParser } from 'lib/dashboardVariables/customCommaValuesParser';
import sortValues from 'lib/dashboardVariables/sortVariableValues';
import { isEmpty, map } from 'lodash-es';
import {
ArrowLeft,

View File

@@ -1,10 +1,11 @@
import { memo, useMemo } from 'react';
import { commaValuesParser } from 'lib/dashbaordVariables/customCommaValuesParser';
import sortValues from 'lib/dashbaordVariables/sortVariableValues';
import { memo, useEffect, useMemo } from 'react';
import { commaValuesParser } from 'lib/dashboardVariables/customCommaValuesParser';
import sortValues from 'lib/dashboardVariables/sortVariableValues';
import SelectVariableInput from './SelectVariableInput';
import { useDashboardVariableSelectHelper } from './useDashboardVariableSelectHelper';
import { VariableItemProps } from './VariableItem';
import { customVariableSelectStrategy } from './variableSelectStrategy/customVariableSelectStrategy';
type CustomVariableInputProps = Pick<
VariableItemProps,
@@ -29,16 +30,31 @@ function CustomVariableInput({
onChange,
onDropdownVisibleChange,
handleClear,
applyDefaultIfNeeded,
} = useDashboardVariableSelectHelper({
variableData,
optionsData,
onValueUpdate,
strategy: customVariableSelectStrategy,
});
// Apply default on mount — options are available synchronously for custom variables
// eslint-disable-next-line react-hooks/exhaustive-deps
useEffect(applyDefaultIfNeeded, []);
const selectOptions = useMemo(
() =>
optionsData.map((option) => ({
label: option.toString(),
value: option.toString(),
})),
[optionsData],
);
return (
<SelectVariableInput
variableId={variableData.id}
options={optionsData}
options={selectOptions}
value={value}
onChange={onChange}
onDropdownVisibleChange={onDropdownVisibleChange}

View File

@@ -13,7 +13,6 @@ import { AppState } from 'store/reducers';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import { GlobalReducer } from 'types/reducer/globalTime';
import DynamicVariableSelection from './DynamicVariableSelection';
import { onUpdateVariableNode } from './util';
import VariableItem from './VariableItem';
@@ -153,14 +152,7 @@ function DashboardVariableSelection(): JSX.Element | null {
{sortedVariablesArray.map((variable) => {
const key = `${variable.name}${variable.id}${variable.order}`;
return variable.type === 'DYNAMIC' ? (
<DynamicVariableSelection
key={key}
existingVariables={dashboardVariables}
variableData={variable}
onValueUpdate={onValueUpdate}
/>
) : (
return (
<VariableItem
key={key}
existingVariables={dashboardVariables}

View File

@@ -0,0 +1,343 @@
import { memo, useCallback, useMemo, useState } from 'react';
import { useQuery } from 'react-query';
import { useSelector } from 'react-redux';
import { getFieldValues } from 'api/dynamicVariables/getFieldValues';
import { SOMETHING_WENT_WRONG } from 'constants/api';
import { DEBOUNCE_DELAY } from 'constants/queryBuilderFilterConfig';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import useDebounce from 'hooks/useDebounce';
import { isEmpty } from 'lodash-es';
import { AppState } from 'store/reducers';
import { GlobalReducer } from 'types/reducer/globalTime';
import { isRetryableError as checkIfRetryableError } from 'utils/errorUtils';
import SelectVariableInput from './SelectVariableInput';
import { useDashboardVariableSelectHelper } from './useDashboardVariableSelectHelper';
import { getOptionsForDynamicVariable } from './util';
import { VariableItemProps } from './VariableItem';
import { dynamicVariableSelectStrategy } from './variableSelectStrategy/dynamicVariableSelectStrategy';
import './DashboardVariableSelection.styles.scss';
type DynamicVariableInputProps = Pick<
VariableItemProps,
'variableData' | 'onValueUpdate' | 'existingVariables'
>;
// eslint-disable-next-line sonarjs/cognitive-complexity
function DynamicVariableInput({
variableData,
onValueUpdate,
existingVariables,
}: DynamicVariableInputProps): JSX.Element {
const [optionsData, setOptionsData] = useState<(string | number | boolean)[]>(
[],
);
const [errorMessage, setErrorMessage] = useState<null | string>(null);
const [isRetryableError, setIsRetryableError] = useState<boolean>(true);
const [isComplete, setIsComplete] = useState<boolean>(false);
const [filteredOptionsData, setFilteredOptionsData] = useState<
(string | number | boolean)[]
>([]);
const [relatedValues, setRelatedValues] = useState<string[]>([]);
const [originalRelatedValues, setOriginalRelatedValues] = useState<string[]>(
[],
);
// Track dropdown open state for auto-checking new values
const [isDropdownOpen, setIsDropdownOpen] = useState<boolean>(false);
const [apiSearchText, setApiSearchText] = useState<string>('');
const debouncedApiSearchText = useDebounce(apiSearchText, DEBOUNCE_DELAY);
// Build a memoized list of all currently available option strings (normalized + related)
const allAvailableOptionStrings = useMemo(
() => [
...new Set([
...optionsData.map((v) => v.toString()),
...relatedValues.map((v) => v.toString()),
]),
],
[optionsData, relatedValues],
);
const {
value,
tempSelection,
setTempSelection,
handleClear,
enableSelectAll,
defaultValue,
applyDefaultIfNeeded,
onChange,
onDropdownVisibleChange,
} = useDashboardVariableSelectHelper({
variableData,
optionsData,
onValueUpdate,
strategy: dynamicVariableSelectStrategy,
allAvailableOptionStrings,
});
// Create a dependency key from all dynamic variables
const dynamicVariablesKey = useMemo(() => {
if (!existingVariables) {
return 'no_variables';
}
const dynamicVars = Object.values(existingVariables)
.filter((v) => v.type === 'DYNAMIC')
.map(
(v) => `${v.name || 'unnamed'}:${JSON.stringify(v.selectedValue || null)}`,
)
.join('|');
return dynamicVars || 'no_dynamic_variables';
}, [existingVariables]);
const { maxTime, minTime } = useSelector<AppState, GlobalReducer>(
(state) => state.globalTime,
);
// existing query is the query built from the other dynamic variables around this one with their current values
// e.g. k8s.namespace.name IN ["zeus", "gene"] AND doc_op_type IN ["test"]
// eslint-disable-next-line sonarjs/cognitive-complexity
const existingQuery = useMemo(() => {
if (!existingVariables || !variableData.dynamicVariablesAttribute) {
return '';
}
const queryParts: string[] = [];
Object.entries(existingVariables).forEach(([, variable]) => {
// Skip the current variable being processed
if (variable.id === variableData.id) {
return;
}
// Only include dynamic variables that have selected values and are not selected as ALL
if (
variable.type === 'DYNAMIC' &&
variable.dynamicVariablesAttribute &&
variable.selectedValue &&
!isEmpty(variable.selectedValue) &&
(variable.showALLOption ? !variable.allSelected : true)
) {
const attribute = variable.dynamicVariablesAttribute;
const values = Array.isArray(variable.selectedValue)
? variable.selectedValue
: [variable.selectedValue];
// Filter out empty values and convert to strings
const validValues = values
.filter((val) => val !== null && val !== undefined && val !== '')
.map((val) => val.toString());
if (validValues.length > 0) {
// Format values for query - wrap strings in quotes, keep numbers as is
const formattedValues = validValues.map((val) => {
// Check if value is a number
const numValue = Number(val);
if (!Number.isNaN(numValue) && Number.isFinite(numValue)) {
return val; // Keep as number
}
// Escape single quotes and wrap in quotes
return `'${val.replace(/'/g, "\\'")}'`;
});
if (formattedValues.length === 1) {
queryParts.push(`${attribute} = ${formattedValues[0]}`);
} else {
queryParts.push(`${attribute} IN [${formattedValues.join(', ')}]`);
}
}
}
});
return queryParts.join(' AND ');
}, [
existingVariables,
variableData.id,
variableData.dynamicVariablesAttribute,
]);
// Wrap the hook's onDropdownVisibleChange to also track isDropdownOpen and handle cleanup
const handleSelectDropdownVisibilityChange = useCallback(
(visible: boolean): void => {
setIsDropdownOpen(visible);
onDropdownVisibleChange(visible);
if (!visible) {
setFilteredOptionsData(optionsData);
setRelatedValues(originalRelatedValues);
setApiSearchText('');
}
},
[onDropdownVisibleChange, optionsData, originalRelatedValues],
);
const { isLoading, refetch } = useQuery(
[
REACT_QUERY_KEY.DASHBOARD_BY_ID,
variableData.name || `variable_${variableData.id}`,
dynamicVariablesKey,
minTime,
maxTime,
debouncedApiSearchText,
variableData.dynamicVariablesSource,
variableData.dynamicVariablesAttribute,
],
{
enabled:
variableData.type === 'DYNAMIC' &&
!!variableData.dynamicVariablesSource &&
!!variableData.dynamicVariablesAttribute,
queryFn: () =>
getFieldValues(
variableData.dynamicVariablesSource?.toLowerCase() === 'all telemetry'
? undefined
: (variableData.dynamicVariablesSource?.toLowerCase() as
| 'traces'
| 'logs'
| 'metrics'),
variableData.dynamicVariablesAttribute,
debouncedApiSearchText,
minTime,
maxTime,
existingQuery,
),
onSuccess: (data) => {
const newNormalizedValues = data.data?.normalizedValues || [];
const newRelatedValues = data.data?.relatedValues || [];
if (!debouncedApiSearchText) {
setOptionsData(newNormalizedValues);
setIsComplete(data.data?.complete || false);
}
setFilteredOptionsData(newNormalizedValues);
setRelatedValues(newRelatedValues);
setOriginalRelatedValues(newRelatedValues);
// Only run auto-check logic when necessary to avoid performance issues
if (variableData.allSelected && isDropdownOpen) {
// Build the latest full list from API (normalized + related)
const latestValues = [
...new Set([
...newNormalizedValues.map((v) => v.toString()),
...newRelatedValues.map((v) => v.toString()),
]),
];
// Update temp selection to exactly reflect latest API values when ALL is active
const currentStrings = Array.isArray(tempSelection)
? tempSelection.map((v) => v.toString())
: tempSelection
? [tempSelection.toString()]
: [];
const areSame =
currentStrings.length === latestValues.length &&
latestValues.every((v) => currentStrings.includes(v));
if (!areSame) {
setTempSelection(latestValues);
}
}
// Apply default if no value is selected (e.g., new variable, first load)
if (!debouncedApiSearchText) {
const allNewOptions = [
...new Set([
...newNormalizedValues.map((v) => v.toString()),
...newRelatedValues.map((v) => v.toString()),
]),
];
applyDefaultIfNeeded(allNewOptions);
}
},
onError: (error: any) => {
if (error) {
let message = SOMETHING_WENT_WRONG;
if (error?.message) {
message = error?.message;
} else {
message =
'Please make sure configuration is valid and you have required setup and permissions';
}
setErrorMessage(message);
// Check if error is retryable (5xx) or not (4xx)
const isRetryable = checkIfRetryableError(error);
setIsRetryableError(isRetryable);
}
},
},
);
const handleRetry = useCallback((): void => {
setErrorMessage(null);
setIsRetryableError(true);
refetch();
}, [refetch]);
const handleSearch = useCallback(
(text: string) => {
if (isComplete) {
if (!text) {
setFilteredOptionsData(optionsData);
setRelatedValues(originalRelatedValues);
return;
}
const lowerText = text.toLowerCase();
setFilteredOptionsData(
optionsData.filter((option) =>
option.toString().toLowerCase().includes(lowerText),
),
);
setRelatedValues(
originalRelatedValues.filter((val) =>
val.toLowerCase().includes(lowerText),
),
);
} else {
setApiSearchText(text);
}
},
[isComplete, optionsData, originalRelatedValues],
);
const selectOptions = useMemo(
() =>
getOptionsForDynamicVariable(filteredOptionsData || [], relatedValues || []),
[filteredOptionsData, relatedValues],
);
return (
<SelectVariableInput
variableId={variableData.id}
options={selectOptions}
value={value}
onChange={onChange}
onDropdownVisibleChange={handleSelectDropdownVisibilityChange}
onClear={handleClear}
enableSelectAll={enableSelectAll}
defaultValue={defaultValue}
isMultiSelect={variableData.multiSelect}
// dynamic variable specific + API related props
loading={isLoading}
errorMessage={errorMessage}
onRetry={handleRetry}
isDynamicVariable
showRetryButton={isRetryableError}
showIncompleteDataMessage={!isComplete && filteredOptionsData.length > 0}
onSearch={handleSearch}
/>
);
}
export default memo(DynamicVariableInput);
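The existingQuery memo above applies a small formatting rule when turning neighbouring variable selections into a filter clause: numeric values stay unquoted, everything else is single-quoted with embedded quotes escaped, a single value uses "=", and multiple values use "IN [...]". A standalone sketch of that rule (the function name is illustrative and not part of the codebase):

function formatFilterClause(
	attribute: string,
	values: (string | number | boolean)[],
): string {
	// Drop empty values and normalise everything to strings, as the memo does.
	const formatted = values
		.filter((val) => val !== null && val !== undefined && val !== '')
		.map((val) => val.toString())
		.map((val) => {
			const numValue = Number(val);
			if (!Number.isNaN(numValue) && Number.isFinite(numValue)) {
				return val; // keep numeric values unquoted
			}
			return `'${val.replace(/'/g, "\\'")}'`; // escape and single-quote strings
		});
	return formatted.length === 1
		? `${attribute} = ${formatted[0]}`
		: `${attribute} IN [${formatted.join(', ')}]`;
}

// formatFilterClause('k8s.namespace.name', ['zeus', 'gene'])
// => "k8s.namespace.name IN ['zeus', 'gene']"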

View File

@@ -1,602 +0,0 @@
/* eslint-disable sonarjs/cognitive-complexity */
/* eslint-disable no-nested-ternary */
import { useCallback, useEffect, useMemo, useState } from 'react';
import { useQuery } from 'react-query';
import { useSelector } from 'react-redux';
import { InfoCircleOutlined } from '@ant-design/icons';
import { Tooltip, Typography } from 'antd';
import { getFieldValues } from 'api/dynamicVariables/getFieldValues';
import { CustomMultiSelect, CustomSelect } from 'components/NewSelect';
import { SOMETHING_WENT_WRONG } from 'constants/api';
import { DEBOUNCE_DELAY } from 'constants/queryBuilderFilterConfig';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import useDebounce from 'hooks/useDebounce';
import { isEmpty, isUndefined } from 'lodash-es';
import { AppState } from 'store/reducers';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import { GlobalReducer } from 'types/reducer/globalTime';
import { isRetryableError as checkIfRetryableError } from 'utils/errorUtils';
import { popupContainer } from 'utils/selectPopupContainer';
import { ALL_SELECT_VALUE } from '../utils';
import { SelectItemStyle } from './styles';
import {
areArraysEqual,
getOptionsForDynamicVariable,
getSelectValue,
uniqueValues,
} from './util';
import './DashboardVariableSelection.styles.scss';
interface DynamicVariableSelectionProps {
variableData: IDashboardVariable;
existingVariables: Record<string, IDashboardVariable>;
onValueUpdate: (
name: string,
id: string,
arg1: IDashboardVariable['selectedValue'],
allSelected: boolean,
haveCustomValuesSelected?: boolean,
) => void;
}
function DynamicVariableSelection({
variableData,
onValueUpdate,
existingVariables,
}: DynamicVariableSelectionProps): JSX.Element {
const [optionsData, setOptionsData] = useState<(string | number | boolean)[]>(
[],
);
const [errorMessage, setErrorMessage] = useState<null | string>(null);
const [isRetryableError, setIsRetryableError] = useState<boolean>(true);
const [isComplete, setIsComplete] = useState<boolean>(false);
const [filteredOptionsData, setFilteredOptionsData] = useState<
(string | number | boolean)[]
>([]);
const [relatedValues, setRelatedValues] = useState<string[]>([]);
const [originalRelatedValues, setOriginalRelatedValues] = useState<string[]>(
[],
);
const [tempSelection, setTempSelection] = useState<
string | string[] | undefined
>(undefined);
// Track dropdown open state for auto-checking new values
const [isDropdownOpen, setIsDropdownOpen] = useState<boolean>(false);
// Create a dependency key from all dynamic variables
const dynamicVariablesKey = useMemo(() => {
if (!existingVariables) {
return 'no_variables';
}
const dynamicVars = Object.values(existingVariables)
.filter((v) => v.type === 'DYNAMIC')
.map(
(v) => `${v.name || 'unnamed'}:${JSON.stringify(v.selectedValue || null)}`,
)
.join('|');
return dynamicVars || 'no_dynamic_variables';
}, [existingVariables]);
const [apiSearchText, setApiSearchText] = useState<string>('');
const debouncedApiSearchText = useDebounce(apiSearchText, DEBOUNCE_DELAY);
const { maxTime, minTime } = useSelector<AppState, GlobalReducer>(
(state) => state.globalTime,
);
// existing query is the query built from the other dynamic variables around this one with their current values
// e.g. k8s.namespace.name IN ["zeus", "gene"] AND doc_op_type IN ["test"]
const existingQuery = useMemo(() => {
if (!existingVariables || !variableData.dynamicVariablesAttribute) {
return '';
}
const queryParts: string[] = [];
Object.entries(existingVariables).forEach(([, variable]) => {
// Skip the current variable being processed
if (variable.id === variableData.id) {
return;
}
// Only include dynamic variables that have selected values and are not selected as ALL
if (
variable.type === 'DYNAMIC' &&
variable.dynamicVariablesAttribute &&
variable.selectedValue &&
!isEmpty(variable.selectedValue) &&
(variable.showALLOption ? !variable.allSelected : true)
) {
const attribute = variable.dynamicVariablesAttribute;
const values = Array.isArray(variable.selectedValue)
? variable.selectedValue
: [variable.selectedValue];
// Filter out empty values and convert to strings
const validValues = values
.filter((value) => value !== null && value !== undefined && value !== '')
.map((value) => value.toString());
if (validValues.length > 0) {
// Format values for query - wrap strings in quotes, keep numbers as is
const formattedValues = validValues.map((value) => {
// Check if value is a number
const numValue = Number(value);
if (!Number.isNaN(numValue) && Number.isFinite(numValue)) {
return value; // Keep as number
}
// Escape single quotes and wrap in quotes
return `'${value.replace(/'/g, "\\'")}'`;
});
if (formattedValues.length === 1) {
queryParts.push(`${attribute} = ${formattedValues[0]}`);
} else {
queryParts.push(`${attribute} IN [${formattedValues.join(', ')}]`);
}
}
}
});
return queryParts.join(' AND ');
}, [
existingVariables,
variableData.id,
variableData.dynamicVariablesAttribute,
]);
const { isLoading, refetch } = useQuery(
[
REACT_QUERY_KEY.DASHBOARD_BY_ID,
variableData.name || `variable_${variableData.id}`,
dynamicVariablesKey,
minTime,
maxTime,
debouncedApiSearchText,
],
{
enabled: variableData.type === 'DYNAMIC',
queryFn: () =>
getFieldValues(
variableData.dynamicVariablesSource?.toLowerCase() === 'all telemetry'
? undefined
: (variableData.dynamicVariablesSource?.toLowerCase() as
| 'traces'
| 'logs'
| 'metrics'),
variableData.dynamicVariablesAttribute,
debouncedApiSearchText,
minTime,
maxTime,
existingQuery,
),
onSuccess: (data) => {
const newNormalizedValues = data.data?.normalizedValues || [];
const newRelatedValues = data.data?.relatedValues || [];
if (!debouncedApiSearchText) {
setOptionsData(newNormalizedValues);
setIsComplete(data.data?.complete || false);
}
setFilteredOptionsData(newNormalizedValues);
setRelatedValues(newRelatedValues);
setOriginalRelatedValues(newRelatedValues);
// Only run auto-check logic when necessary to avoid performance issues
if (variableData.allSelected && isDropdownOpen) {
// Build the latest full list from API (normalized + related)
const latestValues = [
...new Set([
...newNormalizedValues.map((v) => v.toString()),
...newRelatedValues.map((v) => v.toString()),
]),
];
// Update temp selection to exactly reflect latest API values when ALL is active
const currentStrings = Array.isArray(tempSelection)
? tempSelection.map((v) => v.toString())
: tempSelection
? [tempSelection.toString()]
: [];
const areSame =
currentStrings.length === latestValues.length &&
latestValues.every((v) => currentStrings.includes(v));
if (!areSame) {
setTempSelection(latestValues);
}
}
},
onError: (error: any) => {
if (error) {
let message = SOMETHING_WENT_WRONG;
if (error?.message) {
message = error?.message;
} else {
message =
'Please make sure configuration is valid and you have required setup and permissions';
}
setErrorMessage(message);
// Check if error is retryable (5xx) or not (4xx)
const isRetryable = checkIfRetryableError(error);
setIsRetryableError(isRetryable);
}
},
},
);
const handleChange = useCallback(
(inputValue: string | string[]): void => {
const value = variableData.multiSelect && !inputValue ? [] : inputValue;
if (
value === variableData.selectedValue ||
(Array.isArray(value) &&
Array.isArray(variableData.selectedValue) &&
areArraysEqual(value, variableData.selectedValue))
) {
return;
}
if (variableData.name) {
if (
value === ALL_SELECT_VALUE ||
(Array.isArray(value) && value.includes(ALL_SELECT_VALUE))
) {
// For ALL selection in dynamic variables, pass null to avoid storing values
// The parent component will handle this appropriately
onValueUpdate(variableData.name, variableData.id, null, true);
} else {
// Build union of available options shown in dropdown (normalized + related)
const allAvailableOptionStrings = [
...new Set([
...optionsData.map((v) => v.toString()),
...relatedValues.map((v) => v.toString()),
]),
];
const haveCustomValuesSelected =
Array.isArray(value) &&
!value.every((v) => allAvailableOptionStrings.includes(v.toString()));
onValueUpdate(
variableData.name,
variableData.id,
value,
allAvailableOptionStrings.every((v) => value.includes(v.toString())),
haveCustomValuesSelected,
);
}
}
},
[variableData, onValueUpdate, optionsData, relatedValues],
);
useEffect(() => {
if (
variableData.dynamicVariablesSource &&
variableData.dynamicVariablesAttribute
) {
refetch();
}
}, [
refetch,
variableData.dynamicVariablesSource,
variableData.dynamicVariablesAttribute,
debouncedApiSearchText,
]);
// Build a memoized list of all currently available option strings (normalized + related)
const allAvailableOptionStrings = useMemo(
() => [
...new Set([
...optionsData.map((v) => v.toString()),
...relatedValues.map((v) => v.toString()),
]),
],
[optionsData, relatedValues],
);
const handleSearch = useCallback(
(text: string) => {
if (isComplete) {
if (!text) {
setFilteredOptionsData(optionsData);
setRelatedValues(originalRelatedValues);
return;
}
const localFilteredOptionsData: (string | number | boolean)[] = [];
optionsData.forEach((option) => {
if (option.toString().toLowerCase().includes(text.toLowerCase())) {
localFilteredOptionsData.push(option);
}
});
setFilteredOptionsData(localFilteredOptionsData);
setRelatedValues(
originalRelatedValues.filter((value) =>
value.toLowerCase().includes(text.toLowerCase()),
),
);
} else {
setApiSearchText(text);
}
},
[isComplete, optionsData, originalRelatedValues],
);
const { selectedValue } = variableData;
const selectedValueStringified = useMemo(
() => getSelectValue(selectedValue, variableData),
[selectedValue, variableData],
);
const enableSelectAll = variableData.multiSelect && variableData.showALLOption;
const selectValue =
variableData.allSelected && enableSelectAll
? ALL_SELECT_VALUE
: selectedValueStringified;
// Add a handler for tracking temporary selection changes
const handleTempChange = useCallback(
(inputValue: string | string[]): void => {
// Store the selection in temporary state while dropdown is open
const value = variableData.multiSelect && !inputValue ? [] : inputValue;
const sanitizedValue = uniqueValues(value);
setTempSelection(sanitizedValue);
},
[variableData.multiSelect],
);
// Handle dropdown visibility changes
const handleDropdownVisibleChange = (visible: boolean): void => {
// Update dropdown open state for auto-checking
setIsDropdownOpen(visible);
// Initialize temp selection when opening dropdown
if (visible) {
if (isUndefined(tempSelection) && selectValue === ALL_SELECT_VALUE) {
// When ALL is selected, set selection to exactly the latest available values
const latestAll = [...allAvailableOptionStrings];
setTempSelection(latestAll);
} else {
setTempSelection(getSelectValue(variableData.selectedValue, variableData));
}
}
// Apply changes when closing dropdown
else if (!visible && tempSelection !== undefined) {
// Only call handleChange if there's actually a change in the selection
const currentValue = variableData.selectedValue;
// Helper function to check if arrays have the same elements regardless of order
const areArraysEqualIgnoreOrder = (a: any[], b: any[]): boolean => {
if (a.length !== b.length) {
return false;
}
const sortedA = [...a].sort();
const sortedB = [...b].sort();
return areArraysEqual(sortedA, sortedB);
};
// If ALL was selected before and remains ALL after, skip updating
const wasAllSelected = enableSelectAll && variableData.allSelected;
const isAllSelectedAfter =
enableSelectAll &&
Array.isArray(tempSelection) &&
tempSelection.length === allAvailableOptionStrings.length &&
allAvailableOptionStrings.every((v) => tempSelection.includes(v));
if (wasAllSelected && isAllSelectedAfter) {
setTempSelection(undefined);
return;
}
const hasChanged =
tempSelection !== currentValue &&
!(
Array.isArray(tempSelection) &&
Array.isArray(currentValue) &&
areArraysEqualIgnoreOrder(tempSelection, currentValue)
);
if (hasChanged) {
handleChange(tempSelection);
}
setTempSelection(undefined);
}
// Always reset filtered data when dropdown closes, regardless of tempSelection state
if (!visible) {
setFilteredOptionsData(optionsData);
setRelatedValues(originalRelatedValues);
setApiSearchText('');
}
};
useEffect(
() => (): void => {
// Cleanup on unmount
setTempSelection(undefined);
setFilteredOptionsData([]);
setRelatedValues([]);
setApiSearchText('');
},
[],
);
// eslint-disable-next-line sonarjs/cognitive-complexity
const finalSelectedValues = useMemo(() => {
if (variableData.multiSelect) {
let value = tempSelection || selectedValue;
if (isEmpty(value)) {
if (variableData.showALLOption) {
if (variableData.defaultValue) {
value = variableData.defaultValue;
} else if (variableData.allSelected) {
// If ALL is selected but no stored values, derive from available options
// This handles the case where we don't store values in localStorage for ALL
value = allAvailableOptionStrings;
} else {
value = optionsData;
}
} else if (variableData.defaultValue) {
value = variableData.defaultValue;
} else {
value = optionsData?.[0];
}
}
return value;
}
if (isEmpty(selectedValue)) {
if (variableData.defaultValue) {
return variableData.defaultValue;
}
return optionsData[0]?.toString();
}
return selectedValue;
}, [
variableData.multiSelect,
variableData.showALLOption,
variableData.defaultValue,
variableData.allSelected,
selectedValue,
tempSelection,
optionsData,
allAvailableOptionStrings,
]);
useEffect(() => {
if (
(variableData.multiSelect && !(tempSelection || selectValue)) ||
isEmpty(selectValue)
) {
handleChange(finalSelectedValues as string[] | string);
}
}, [
finalSelectedValues,
handleChange,
selectValue,
tempSelection,
variableData.multiSelect,
]);
return (
<div className="variable-item">
<Typography.Text className="variable-name" ellipsis>
${variableData.name}
{variableData.description && (
<Tooltip title={variableData.description}>
<InfoCircleOutlined className="info-icon" />
</Tooltip>
)}
</Typography.Text>
<div className="variable-value">
{variableData.multiSelect ? (
<CustomMultiSelect
key={variableData.id}
options={getOptionsForDynamicVariable(
filteredOptionsData || [],
relatedValues || [],
)}
defaultValue={variableData.defaultValue}
onChange={handleTempChange}
bordered={false}
placeholder="Select value"
placement="bottomLeft"
style={SelectItemStyle}
loading={isLoading}
showSearch
data-testid="variable-select"
className="variable-select"
popupClassName="dropdown-styles"
maxTagCount={2}
getPopupContainer={popupContainer}
value={
(tempSelection || selectValue) === ALL_SELECT_VALUE
? 'ALL'
: tempSelection || selectValue
}
onDropdownVisibleChange={handleDropdownVisibleChange}
errorMessage={errorMessage}
// eslint-disable-next-line react/no-unstable-nested-components
maxTagPlaceholder={(omittedValues): JSX.Element => {
const maxDisplayValues = 10;
const valuesToShow = omittedValues.slice(0, maxDisplayValues);
const hasMore = omittedValues.length > maxDisplayValues;
const tooltipText =
valuesToShow.map(({ value }) => value).join(', ') +
(hasMore ? ` + ${omittedValues.length - maxDisplayValues} more` : '');
return (
<Tooltip title={tooltipText}>
<span>+ {omittedValues.length} </span>
</Tooltip>
);
}}
onClear={(): void => {
handleChange([]);
}}
enableAllSelection={enableSelectAll}
maxTagTextLength={30}
onSearch={handleSearch}
onRetry={(): void => {
setErrorMessage(null);
setIsRetryableError(true);
refetch();
}}
showIncompleteDataMessage={!isComplete && filteredOptionsData.length > 0}
isDynamicVariable
showRetryButton={isRetryableError}
/>
) : (
<CustomSelect
key={variableData.id}
onChange={handleChange}
bordered={false}
placeholder="Select value"
style={SelectItemStyle}
loading={isLoading}
showSearch
data-testid="variable-select"
className="variable-select"
popupClassName="dropdown-styles"
getPopupContainer={popupContainer}
options={getOptionsForDynamicVariable(
filteredOptionsData || [],
relatedValues || [],
)}
value={selectValue}
defaultValue={variableData.defaultValue}
errorMessage={errorMessage}
onSearch={handleSearch}
// eslint-disable-next-line sonarjs/no-identical-functions
onRetry={(): void => {
setErrorMessage(null);
setIsRetryableError(true);
refetch();
}}
showIncompleteDataMessage={!isComplete && filteredOptionsData.length > 0}
isDynamicVariable
showRetryButton={isRetryableError}
/>
)}
</div>
</div>
);
}
export default DynamicVariableSelection;
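The handleChange logic in the removed component above decides what to report to onValueUpdate when a selection is committed: picking ALL reports null with allSelected=true, otherwise allSelected and haveCustomValuesSelected are derived from the currently available option strings (the same flag the new VariableItemProps carries further below). A minimal sketch of that decision, assuming ALL_SELECT_VALUE is a plain sentinel string; decideUpdate itself is illustrative:

// ALL_SELECT_VALUE mirrors the sentinel imported from '../utils' in the code
// above; its concrete value here is illustrative.
const ALL_SELECT_VALUE = '__ALL__';

interface UpdateDecision {
	value: string[] | null;
	allSelected: boolean;
	haveCustomValuesSelected: boolean;
}

function decideUpdate(selected: string[], available: string[]): UpdateDecision {
	// Selecting ALL stores no concrete values; the parent handles it.
	if (selected.includes(ALL_SELECT_VALUE)) {
		return { value: null, allSelected: true, haveCustomValuesSelected: false };
	}
	return {
		value: selected,
		// Every available option is selected => treated as ALL going forward.
		allSelected: available.every((v) => selected.includes(v)),
		// Any selected value not offered by the API counts as a custom value.
		haveCustomValuesSelected: !selected.every((v) => available.includes(v)),
	};
}

// decideUpdate(['zeus', 'gene'], ['zeus', 'gene', 'atlas'])
// => { value: ['zeus', 'gene'], allSelected: false, haveCustomValuesSelected: false }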

View File

@@ -1,13 +1,11 @@
import { memo, useCallback, useState } from 'react';
import { memo, useCallback, useMemo, useState } from 'react';
import { useQuery } from 'react-query';
import { useSelector } from 'react-redux';
import dashboardVariablesQuery from 'api/dashboard/variables/dashboardVariablesQuery';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import sortValues from 'lib/dashbaordVariables/sortVariableValues';
import sortValues from 'lib/dashboardVariables/sortVariableValues';
import { isArray, isString } from 'lodash-es';
import { IDependencyData } from 'providers/Dashboard/store/dashboardVariables/dashboardVariablesStoreTypes';
import { AppState } from 'store/reducers';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import { VariableResponseProps } from 'types/api/dashboard/variables/query';
import { GlobalReducer } from 'types/reducer/globalTime';
@@ -15,20 +13,18 @@ import { variablePropsToPayloadVariables } from '../utils';
import SelectVariableInput from './SelectVariableInput';
import { useDashboardVariableSelectHelper } from './useDashboardVariableSelectHelper';
import { areArraysEqual, checkAPIInvocation } from './util';
import { VariableItemProps } from './VariableItem';
import { queryVariableSelectStrategy } from './variableSelectStrategy/queryVariableSelectStrategy';
interface QueryVariableInputProps {
variableData: IDashboardVariable;
existingVariables: Record<string, IDashboardVariable>;
onValueUpdate: (
name: string,
id: string,
value: IDashboardVariable['selectedValue'],
allSelected: boolean,
) => void;
variablesToGetUpdated: string[];
setVariablesToGetUpdated: React.Dispatch<React.SetStateAction<string[]>>;
dependencyData: IDependencyData | null;
}
type QueryVariableInputProps = Pick<
VariableItemProps,
| 'variableData'
| 'existingVariables'
| 'onValueUpdate'
| 'variablesToGetUpdated'
| 'setVariablesToGetUpdated'
| 'dependencyData'
>;
function QueryVariableInput({
variableData,
@@ -56,13 +52,15 @@ function QueryVariableInput({
onChange,
onDropdownVisibleChange,
handleClear,
applyDefaultIfNeeded,
} = useDashboardVariableSelectHelper({
variableData,
optionsData,
onValueUpdate,
strategy: queryVariableSelectStrategy,
});
const validVariableUpdate = (): boolean => {
const validVariableUpdate = useCallback((): boolean => {
if (!variableData.name) {
return false;
}
@@ -70,86 +68,100 @@ function QueryVariableInput({
variablesToGetUpdated.length &&
variablesToGetUpdated[0] === variableData.name,
);
};
}, [variableData.name, variablesToGetUpdated]);
// eslint-disable-next-line sonarjs/cognitive-complexity
const getOptions = (variablesRes: VariableResponseProps | null): void => {
try {
setErrorMessage(null);
const getOptions = useCallback(
// eslint-disable-next-line sonarjs/cognitive-complexity
(variablesRes: VariableResponseProps | null): void => {
try {
setErrorMessage(null);
if (
variablesRes?.variableValues &&
Array.isArray(variablesRes?.variableValues)
) {
const newOptionsData = sortValues(
variablesRes?.variableValues,
variableData.sort,
);
if (
variablesRes?.variableValues &&
Array.isArray(variablesRes?.variableValues)
) {
const newOptionsData = sortValues(
variablesRes?.variableValues,
variableData.sort,
);
const oldOptionsData = sortValues(optionsData, variableData.sort) as never;
const oldOptionsData = sortValues(optionsData, variableData.sort) as never;
if (!areArraysEqual(newOptionsData, oldOptionsData)) {
let valueNotInList = false;
if (!areArraysEqual(newOptionsData, oldOptionsData)) {
let valueNotInList = false;
if (isArray(variableData.selectedValue)) {
variableData.selectedValue.forEach((val) => {
if (!newOptionsData.includes(val)) {
valueNotInList = true;
}
});
} else if (
isString(variableData.selectedValue) &&
!newOptionsData.includes(variableData.selectedValue)
) {
valueNotInList = true;
}
// variablesData.allSelected is added for the case where on change of options we need to update the
// local storage
if (
variableData.name &&
(validVariableUpdate() || valueNotInList || variableData.allSelected)
) {
if (
variableData.allSelected &&
variableData.multiSelect &&
variableData.showALLOption
if (isArray(variableData.selectedValue)) {
variableData.selectedValue.forEach((val) => {
if (!newOptionsData.includes(val)) {
valueNotInList = true;
}
});
} else if (
isString(variableData.selectedValue) &&
!newOptionsData.includes(variableData.selectedValue)
) {
onValueUpdate(variableData.name, variableData.id, newOptionsData, true);
valueNotInList = true;
}
// Update tempSelection to maintain ALL state when dropdown is open
if (tempSelection !== undefined) {
setTempSelection(newOptionsData.map((option) => option.toString()));
}
} else {
const value = variableData.selectedValue;
let allSelected = false;
// variablesData.allSelected is added for the case where on change of options we need to update the
// local storage
if (
variableData.name &&
(validVariableUpdate() || valueNotInList || variableData.allSelected)
) {
if (
variableData.allSelected &&
variableData.multiSelect &&
variableData.showALLOption
) {
onValueUpdate(variableData.name, variableData.id, newOptionsData, true);
if (variableData.multiSelect) {
const { selectedValue } = variableData;
allSelected =
newOptionsData.length > 0 &&
Array.isArray(selectedValue) &&
newOptionsData.every((option) => selectedValue.includes(option));
}
// Update tempSelection to maintain ALL state when dropdown is open
if (tempSelection !== undefined) {
setTempSelection(newOptionsData.map((option) => option.toString()));
}
} else {
const value = variableData.selectedValue;
let allSelected = false;
if (variableData.name && variableData.id) {
onValueUpdate(variableData.name, variableData.id, value, allSelected);
if (variableData.multiSelect) {
const { selectedValue } = variableData;
allSelected =
newOptionsData.length > 0 &&
Array.isArray(selectedValue) &&
newOptionsData.every((option) => selectedValue.includes(option));
}
if (variableData.name && variableData.id) {
onValueUpdate(variableData.name, variableData.id, value, allSelected);
}
}
}
}
setOptionsData(newOptionsData);
} else {
setVariablesToGetUpdated((prev) =>
prev.filter((name) => name !== variableData.name),
);
setOptionsData(newOptionsData);
// Apply default if no value is selected (e.g., new variable, first load)
applyDefaultIfNeeded(newOptionsData);
} else {
setVariablesToGetUpdated((prev) =>
prev.filter((name) => name !== variableData.name),
);
}
}
} catch (e) {
console.error(e);
}
} catch (e) {
console.error(e);
}
};
},
[
variableData,
optionsData,
onValueUpdate,
tempSelection,
setTempSelection,
validVariableUpdate,
setVariablesToGetUpdated,
applyDefaultIfNeeded,
],
);
const { isLoading, refetch } = useQuery(
[
@@ -162,7 +174,6 @@ function QueryVariableInput({
{
enabled:
variableData &&
variableData.type === 'QUERY' &&
checkAPIInvocation(
variablesToGetUpdated,
variableData,
@@ -207,10 +218,19 @@ function QueryVariableInput({
refetch();
}, [refetch]);
const selectOptions = useMemo(
() =>
optionsData.map((option) => ({
label: option.toString(),
value: option.toString(),
})),
[optionsData],
);
return (
<SelectVariableInput
variableId={variableData.id}
options={optionsData}
options={selectOptions}
value={value}
onChange={onChange}
onDropdownVisibleChange={onDropdownVisibleChange}

View File

@@ -3,6 +3,7 @@ import { orange } from '@ant-design/colors';
import { WarningOutlined } from '@ant-design/icons';
import { Popover, Tooltip, Typography } from 'antd';
import { CustomMultiSelect, CustomSelect } from 'components/NewSelect';
import { OptionData } from 'components/NewSelect/types';
import { popupContainer } from 'utils/selectPopupContainer';
import { ALL_SELECT_VALUE } from '../utils';
@@ -12,7 +13,7 @@ const errorIconStyle = { margin: '0 0.5rem' };
interface SelectVariableInputProps {
variableId: string;
options: (string | number | boolean)[];
options: OptionData[];
value: string | string[] | undefined;
enableSelectAll: boolean;
isMultiSelect: boolean;
@@ -23,13 +24,17 @@ interface SelectVariableInputProps {
loading?: boolean;
errorMessage?: string | null;
onRetry?: () => void;
isDynamicVariable?: boolean;
showRetryButton?: boolean;
showIncompleteDataMessage?: boolean;
onSearch?: (searchTerm: string) => void;
}
const MAX_TAG_DISPLAY_VALUES = 10;
function maxTagPlaceholder(
export const renderMaxTagPlaceholder = (
omittedValues: { label?: React.ReactNode; value?: string | number }[],
): JSX.Element {
): JSX.Element => {
const valuesToShow = omittedValues.slice(0, MAX_TAG_DISPLAY_VALUES);
const hasMore = omittedValues.length > MAX_TAG_DISPLAY_VALUES;
const tooltipText =
@@ -41,7 +46,7 @@ function maxTagPlaceholder(
<span>+ {omittedValues.length} </span>
</Tooltip>
);
}
};
function SelectVariableInput({
variableId,
@@ -56,16 +61,11 @@ function SelectVariableInput({
enableSelectAll,
isMultiSelect,
defaultValue,
isDynamicVariable,
showRetryButton,
showIncompleteDataMessage,
onSearch,
}: SelectVariableInputProps): JSX.Element {
const selectOptions = useMemo(
() =>
options.map((option) => ({
label: option.toString(),
value: option.toString(),
})),
[options],
);
const commonProps = useMemo(
() => ({
// main props
@@ -82,23 +82,33 @@ function SelectVariableInput({
showSearch: true,
bordered: false,
// dynamic props
// changing props
'data-testid': 'variable-select',
onChange,
loading,
options: selectOptions,
options,
errorMessage,
onRetry,
// dynamic variable only props
isDynamicVariable,
showRetryButton,
showIncompleteDataMessage,
onSearch,
}),
[
variableId,
defaultValue,
onChange,
loading,
selectOptions,
options,
value,
errorMessage,
onRetry,
isDynamicVariable,
showRetryButton,
showIncompleteDataMessage,
onSearch,
],
);
@@ -110,11 +120,11 @@ function SelectVariableInput({
placement="bottomLeft"
maxTagCount={2}
onDropdownVisibleChange={onDropdownVisibleChange}
maxTagPlaceholder={maxTagPlaceholder}
maxTagPlaceholder={renderMaxTagPlaceholder}
onClear={onClear}
enableAllSelection={enableSelectAll}
maxTagTextLength={30}
allowClear={value !== ALL_SELECT_VALUE && value !== 'ALL'}
allowClear={value !== ALL_SELECT_VALUE}
/>
) : (
<CustomSelect {...commonProps} />

View File

@@ -5,6 +5,7 @@ import { IDependencyData } from 'providers/Dashboard/store/dashboardVariables/da
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import CustomVariableInput from './CustomVariableInput';
import DynamicVariableInput from './DynamicVariableInput';
import QueryVariableInput from './QueryVariableInput';
import TextboxVariableInput from './TextboxVariableInput';
@@ -16,8 +17,9 @@ export interface VariableItemProps {
onValueUpdate: (
name: string,
id: string,
arg1: IDashboardVariable['selectedValue'],
value: IDashboardVariable['selectedValue'],
allSelected: boolean,
haveCustomValuesSelected?: boolean,
) => void;
variablesToGetUpdated: string[];
setVariablesToGetUpdated: React.Dispatch<React.SetStateAction<string[]>>;
@@ -68,6 +70,13 @@ function VariableItem({
dependencyData={dependencyData}
/>
)}
{variableType === 'DYNAMIC' && (
<DynamicVariableInput
variableData={variableData}
onValueUpdate={onValueUpdate}
existingVariables={existingVariables}
/>
)}
</div>
</div>
);

View File

@@ -5,7 +5,7 @@ import * as ReactRedux from 'react-redux';
import { fireEvent, render, screen } from '@testing-library/react';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import DynamicVariableSelection from '../DynamicVariableSelection';
import DynamicVariableInput from '../DynamicVariableInput';
// Don't mock the components - use real ones
@@ -54,7 +54,7 @@ const mockApiResponse = {
// Mock scrollIntoView since it's not available in JSDOM
window.HTMLElement.prototype.scrollIntoView = jest.fn();
describe('DynamicVariableSelection Component', () => {
describe('DynamicVariableInput Component', () => {
const mockOnValueUpdate = jest.fn();
const mockDynamicVariableData: IDashboardVariable = {
@@ -108,18 +108,13 @@ describe('DynamicVariableSelection Component', () => {
it('renders with single select variable correctly', () => {
render(
<DynamicVariableSelection
<DynamicVariableInput
variableData={mockDynamicVariableData}
existingVariables={mockExistingVariables}
onValueUpdate={mockOnValueUpdate}
/>,
);
// Verify component renders correctly
expect(
screen.getByText(`$${mockDynamicVariableData.name}`),
).toBeInTheDocument();
// Verify the selected value is displayed
const selectedItem = screen.getByRole('combobox');
expect(selectedItem).toBeInTheDocument();
@@ -136,18 +131,13 @@ describe('DynamicVariableSelection Component', () => {
};
render(
<DynamicVariableSelection
<DynamicVariableInput
variableData={multiSelectWithAllSelected}
existingVariables={mockExistingVariables}
onValueUpdate={mockOnValueUpdate}
/>,
);
// Verify variable name is rendered
expect(
screen.getByText(`$${multiSelectWithAllSelected.name}`),
).toBeInTheDocument();
// In ALL selected mode, there should be an "ALL" text element
expect(screen.getByText('ALL')).toBeInTheDocument();
});
@@ -164,18 +154,13 @@ describe('DynamicVariableSelection Component', () => {
});
render(
<DynamicVariableSelection
<DynamicVariableInput
variableData={mockDynamicVariableData}
existingVariables={mockExistingVariables}
onValueUpdate={mockOnValueUpdate}
/>,
);
// Verify component renders in loading state
expect(
screen.getByText(`$${mockDynamicVariableData.name}`),
).toBeInTheDocument();
// Open dropdown to see loading text
const selectElement = screen.getByRole('combobox');
fireEvent.mouseDown(selectElement);
@@ -199,18 +184,13 @@ describe('DynamicVariableSelection Component', () => {
});
render(
<DynamicVariableSelection
<DynamicVariableInput
variableData={mockDynamicVariableData}
existingVariables={mockExistingVariables}
onValueUpdate={mockOnValueUpdate}
/>,
);
// Verify the component renders
expect(
screen.getByText(`$${mockDynamicVariableData.name}`),
).toBeInTheDocument();
// For error states, we should check that error handling is in place
// Without opening the dropdown as the error message might be handled differently
expect(ReactQuery.useQuery).toHaveBeenCalled();
@@ -219,7 +199,7 @@ describe('DynamicVariableSelection Component', () => {
it('makes API call to fetch variable values', () => {
render(
<DynamicVariableSelection
<DynamicVariableInput
variableData={mockDynamicVariableData}
existingVariables={mockExistingVariables}
onValueUpdate={mockOnValueUpdate}
@@ -235,6 +215,8 @@ describe('DynamicVariableSelection Component', () => {
'2023-01-01T00:00:00Z', // minTime from useSelector mock
'2023-01-02T00:00:00Z', // maxTime from useSelector mock
'',
'Traces',
'service.name',
],
expect.objectContaining({
enabled: true, // Type is 'DYNAMIC'
@@ -255,16 +237,13 @@ describe('DynamicVariableSelection Component', () => {
};
render(
<DynamicVariableSelection
<DynamicVariableInput
variableData={customVariable}
existingVariables={{ ...mockExistingVariables, custom1: customVariable }}
onValueUpdate={mockOnValueUpdate}
/>,
);
// Verify the component correctly displays the selected value
expect(screen.getByText(`$${customVariable.name}`)).toBeInTheDocument();
// Find the selection item in the component using data-testid
const selectElement = screen.getByTestId('variable-select');
expect(selectElement).toBeInTheDocument();

View File

@@ -63,10 +63,10 @@ describe('VariableItem Default Value Selection Behavior', () => {
expect(screen.getByTestId(VARIABLE_SELECT_TESTID)).toBeInTheDocument();
});
expect(screen.getByText('option1')).toBeInTheDocument();
expect(await screen.findByText('option1')).toBeInTheDocument();
});
test('should show placeholder when no previous and no default', async () => {
test('should auto-select first option when no previous and no default', async () => {
const variable: IDashboardVariable = {
id: TEST_VARIABLE_ID,
name: TEST_VARIABLE_NAME,
@@ -85,7 +85,8 @@ describe('VariableItem Default Value Selection Behavior', () => {
expect(screen.getByTestId(VARIABLE_SELECT_TESTID)).toBeInTheDocument();
});
expect(screen.getByText('Select value')).toBeInTheDocument();
// With the new variable select strategy, the first option is auto-selected
expect(await screen.findByText('option1')).toBeInTheDocument();
});
});
@@ -110,7 +111,7 @@ describe('VariableItem Default Value Selection Behavior', () => {
expect(screen.getByTestId(VARIABLE_SELECT_TESTID)).toBeInTheDocument();
});
expect(screen.getByText('ALL')).toBeInTheDocument();
expect(await screen.findByText('ALL')).toBeInTheDocument();
});
});
@@ -134,7 +135,7 @@ describe('VariableItem Default Value Selection Behavior', () => {
expect(screen.getByTestId(VARIABLE_SELECT_TESTID)).toBeInTheDocument();
});
expect(screen.getByText('Select value')).toBeInTheDocument();
expect(await screen.findByText('Select value')).toBeInTheDocument();
});
});
});

View File

@@ -1,8 +1,14 @@
import { useCallback, useEffect, useMemo, useState } from 'react';
import { useCallback, useMemo, useState } from 'react';
import { isEmpty } from 'lodash-es';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import { ALL_SELECT_VALUE } from '../utils';
import { areArraysEqual, getSelectValue } from './util';
import { VariableSelectStrategy } from './variableSelectStrategy/variableSelectStrategyTypes';
import {
areArraysEqualIgnoreOrder,
uniqueValues,
} from './variableSelectStrategy/variableSelectStrategyUtils';
interface UseDashboardVariableSelectHelperParams {
variableData: IDashboardVariable;
@@ -12,7 +18,11 @@ interface UseDashboardVariableSelectHelperParams {
id: string,
value: IDashboardVariable['selectedValue'],
allSelected: boolean,
haveCustomValuesSelected?: boolean,
) => void;
strategy: VariableSelectStrategy;
/** Override for all available option strings (default: optionsData.map(String)) */
allAvailableOptionStrings?: string[];
}
interface UseDashboardVariableSelectHelperReturn {
@@ -31,6 +41,11 @@ interface UseDashboardVariableSelectHelperReturn {
onChange: (value: string | string[]) => void;
onDropdownVisibleChange: (visible: boolean) => void;
handleClear: () => void;
// Default value helpers
applyDefaultIfNeeded: (
overrideOptions?: (string | number | boolean)[],
) => void;
}
// eslint-disable-next-line sonarjs/cognitive-complexity
@@ -38,6 +53,8 @@ export function useDashboardVariableSelectHelper({
variableData,
optionsData,
onValueUpdate,
strategy,
allAvailableOptionStrings,
}: UseDashboardVariableSelectHelperParams): UseDashboardVariableSelectHelperReturn {
const { selectedValue } = variableData;
@@ -52,11 +69,37 @@ export function useDashboardVariableSelectHelper({
const enableSelectAll = variableData.multiSelect && variableData.showALLOption;
const effectiveAllAvailableOptionStrings = useMemo(
() => allAvailableOptionStrings ?? optionsData.map((v) => v.toString()),
[allAvailableOptionStrings, optionsData],
);
const selectValue =
variableData.allSelected && enableSelectAll
? 'ALL'
? ALL_SELECT_VALUE
: selectedValueStringified;
const getDefaultValue = useCallback(
(overrideOptions?: (string | number | boolean)[]) => {
const options = overrideOptions || optionsData;
if (variableData.multiSelect) {
if (variableData.showALLOption) {
return variableData.defaultValue || options.map((o) => o.toString());
}
return variableData.defaultValue || options?.[0]?.toString();
}
return variableData.defaultValue || options[0]?.toString();
},
[
variableData.multiSelect,
variableData.showALLOption,
variableData.defaultValue,
optionsData,
],
);
const defaultValue = useMemo(() => getDefaultValue(), [getDefaultValue]);
const handleChange = useCallback(
(inputValue: string | string[]): void => {
const value = variableData.multiSelect && !inputValue ? [] : inputValue;
@@ -69,29 +112,21 @@ export function useDashboardVariableSelectHelper({
) {
return;
}
if (variableData.name) {
// Check if ALL is effectively selected by comparing with available options
const isAllSelected =
Array.isArray(value) &&
value.length > 0 &&
optionsData.every((option) => value.includes(option.toString()));
if (isAllSelected && variableData.showALLOption) {
// For ALL selection, pass optionsData as the value and set allSelected to true
onValueUpdate(variableData.name, variableData.id, optionsData, true);
} else {
onValueUpdate(variableData.name, variableData.id, value, false);
}
}
strategy.handleChange({
value,
variableData,
optionsData,
allAvailableOptionStrings: effectiveAllAvailableOptionStrings,
onValueUpdate,
});
},
[
variableData.multiSelect,
variableData.selectedValue,
variableData.name,
variableData.id,
variableData.showALLOption,
onValueUpdate,
variableData,
optionsData,
effectiveAllAvailableOptionStrings,
onValueUpdate,
strategy,
],
);
@@ -99,79 +134,96 @@ export function useDashboardVariableSelectHelper({
(inputValue: string | string[]): void => {
// Store the selection in temporary state while dropdown is open
const value = variableData.multiSelect && !inputValue ? [] : inputValue;
setTempSelection(value);
setTempSelection(uniqueValues(value));
},
[variableData.multiSelect],
);
// Apply default value on first render if no selection exists
const finalSelectedValues = useMemo(() => {
if (variableData.multiSelect) {
let value = tempSelection || selectedValue;
if (isEmpty(value)) {
if (variableData.showALLOption) {
if (variableData.defaultValue) {
value = variableData.defaultValue;
} else {
value = optionsData;
}
} else if (variableData.defaultValue) {
value = variableData.defaultValue;
} else {
value = optionsData?.[0];
// Single select onChange: apply default if value is empty
const handleSingleSelectChange = useCallback(
(inputValue: string | string[]): void => {
if (isEmpty(inputValue)) {
if (defaultValue !== undefined) {
handleChange(defaultValue as string | string[]);
}
return;
}
return value;
}
if (isEmpty(selectedValue)) {
if (variableData.defaultValue) {
return variableData.defaultValue;
}
return optionsData[0]?.toString();
}
return selectedValue;
}, [
variableData.multiSelect,
variableData.showALLOption,
variableData.defaultValue,
selectedValue,
tempSelection,
optionsData,
]);
// Apply default values when needed
useEffect(() => {
if (
(variableData.multiSelect && !(tempSelection || selectValue)) ||
isEmpty(selectValue)
) {
handleChange(finalSelectedValues as string[] | string);
}
}, [
finalSelectedValues,
handleChange,
selectValue,
tempSelection,
variableData.multiSelect,
]);
handleChange(inputValue);
},
[handleChange, defaultValue],
);
// Handle dropdown visibility changes
const onDropdownVisibleChange = useCallback(
(visible: boolean): void => {
// Initialize temp selection when opening dropdown
if (visible) {
setTempSelection(getSelectValue(variableData.selectedValue, variableData));
if (variableData.allSelected && enableSelectAll) {
// When ALL is selected, show all available options as individually checked
setTempSelection([...effectiveAllAvailableOptionStrings]);
} else {
setTempSelection(getSelectValue(variableData.selectedValue, variableData));
}
}
// Apply changes when closing dropdown
else if (!visible && tempSelection !== undefined) {
// Call handleChange with the temporarily stored selection
handleChange(tempSelection);
// If ALL was selected before AND all options remain selected, skip updating
const wasAllSelected = enableSelectAll && variableData.allSelected;
const isAllSelectedAfter =
enableSelectAll &&
Array.isArray(tempSelection) &&
tempSelection.length === effectiveAllAvailableOptionStrings.length &&
effectiveAllAvailableOptionStrings.every((v) => tempSelection.includes(v));
if (wasAllSelected && isAllSelectedAfter) {
setTempSelection(undefined);
return;
}
// Apply default if closing with empty selection
let valueToApply = tempSelection;
if (isEmpty(tempSelection) && defaultValue !== undefined) {
valueToApply = defaultValue as string | string[];
}
// Order-agnostic change detection
const currentValue = variableData.selectedValue;
const hasChanged =
valueToApply !== currentValue &&
!(
Array.isArray(valueToApply) &&
Array.isArray(currentValue) &&
areArraysEqualIgnoreOrder(valueToApply, currentValue)
);
if (hasChanged) {
handleChange(valueToApply);
}
setTempSelection(undefined);
}
},
[variableData, tempSelection, handleChange],
[
variableData,
enableSelectAll,
effectiveAllAvailableOptionStrings,
tempSelection,
handleChange,
defaultValue,
],
);
// Explicit function for callers to apply default on mount / data load
// Pass overrideOptions when freshly-loaded options aren't in state yet (async callers)
const applyDefaultIfNeeded = useCallback(
(overrideOptions?: (string | number | boolean)[]): void => {
if (isEmpty(selectValue)) {
const defaultValueFromOptions = getDefaultValue(overrideOptions);
if (defaultValueFromOptions !== undefined) {
handleChange(defaultValueFromOptions as string | string[]);
}
}
},
[selectValue, handleChange, getDefaultValue],
);
const handleClear = useCallback((): void => {
@@ -182,11 +234,9 @@ export function useDashboardVariableSelectHelper({
? tempSelection || selectValue
: selectValue;
const defaultValue = variableData.defaultValue || selectValue;
const onChange = useMemo(() => {
return variableData.multiSelect ? handleTempChange : handleChange;
}, [variableData.multiSelect, handleTempChange, handleChange]);
return variableData.multiSelect ? handleTempChange : handleSingleSelectChange;
}, [variableData.multiSelect, handleTempChange, handleSingleSelectChange]);
return {
tempSelection,
@@ -197,5 +247,6 @@ export function useDashboardVariableSelectHelper({
value,
defaultValue,
onChange,
applyDefaultIfNeeded,
};
}
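
A minimal usage sketch of the refactored hook, assuming an antd-based input and abbreviated import paths. The component name and props below are hypothetical, not from this PR; only the hook and strategy exports introduced above are real. The strategy owns how onValueUpdate is reported, while the hook keeps the temp-selection, dropdown and default-value mechanics:

// Hypothetical caller (names and paths are illustrative):
import { useEffect } from 'react';
import { Select } from 'antd';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import { useDashboardVariableSelectHelper } from './useDashboardVariableSelectHelper';
import { defaultVariableSelectStrategy } from './variableSelectStrategy/defaultVariableSelectStrategy';

interface ExampleVariableSelectProps {
  variableData: IDashboardVariable;
  optionsData: (string | number | boolean)[];
  onValueUpdate: (
    name: string,
    id: string,
    value: IDashboardVariable['selectedValue'],
    allSelected: boolean,
    haveCustomValuesSelected?: boolean,
  ) => void;
}

function ExampleVariableSelect({
  variableData,
  optionsData,
  onValueUpdate,
}: ExampleVariableSelectProps): JSX.Element {
  const {
    value,
    onChange,
    onDropdownVisibleChange,
    applyDefaultIfNeeded,
  } = useDashboardVariableSelectHelper({
    variableData,
    optionsData,
    onValueUpdate,
    strategy: defaultVariableSelectStrategy,
  });

  // Callers now apply defaults explicitly instead of relying on the removed useEffect.
  useEffect(() => {
    applyDefaultIfNeeded(optionsData);
  }, [optionsData, applyDefaultIfNeeded]);

  return (
    <Select
      mode={variableData.multiSelect ? 'multiple' : undefined}
      value={value}
      onChange={onChange}
      onDropdownVisibleChange={onDropdownVisibleChange}
      options={optionsData.map((o) => ({ label: o.toString(), value: o.toString() }))}
    />
  );
}

export default ExampleVariableSelect;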

View File

@@ -363,25 +363,6 @@ export const uniqueOptions = (options: OptionData[]): OptionData[] => {
return uniqueOptions;
};
export const uniqueValues = (values: string[] | string): string[] | string => {
if (Array.isArray(values)) {
const uniqueValues: string[] = [];
const seenValues = new Set<string>();
values.forEach((value) => {
if (seenValues.has(value)) {
return;
}
seenValues.add(value);
uniqueValues.push(value);
});
return uniqueValues;
}
return values;
};
export const getSelectValue = (
selectedValue: IDashboardVariable['selectedValue'],
variableData: IDashboardVariable,

View File

@@ -0,0 +1,4 @@
import { defaultVariableSelectStrategy } from './defaultVariableSelectStrategy';
import { VariableSelectStrategy } from './variableSelectStrategyTypes';
export const customVariableSelectStrategy: VariableSelectStrategy = defaultVariableSelectStrategy;

View File

@@ -0,0 +1,20 @@
import { VariableSelectStrategy } from './variableSelectStrategyTypes';
export const defaultVariableSelectStrategy: VariableSelectStrategy = {
handleChange({ value, variableData, optionsData, onValueUpdate }) {
if (!variableData.name) {
return;
}
const isAllSelected =
Array.isArray(value) &&
value.length > 0 &&
optionsData.every((option) => value.includes(option.toString()));
if (isAllSelected && variableData.showALLOption) {
onValueUpdate(variableData.name, variableData.id, optionsData, true);
} else {
onValueUpdate(variableData.name, variableData.id, value, false);
}
},
};
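
For reference, a quick illustration with fabricated data of how this default strategy reports an "every option picked" change versus a partial one:

import { IDashboardVariable } from 'types/api/dashboard/getAll';
import { defaultVariableSelectStrategy } from './defaultVariableSelectStrategy';

const optionsData = ['frontend', 'backend'];
const variableData = {
  name: 'service',
  id: 'var1',
  showALLOption: true,
} as IDashboardVariable;

// Every known option selected -> optionsData is forwarded with allSelected = true
defaultVariableSelectStrategy.handleChange({
  value: ['frontend', 'backend'],
  variableData,
  optionsData,
  allAvailableOptionStrings: optionsData,
  onValueUpdate: (name, id, value, allSelected) => {
    // -> 'service', 'var1', ['frontend', 'backend'], true
  },
});

// Partial selection -> the picked values are forwarded with allSelected = false
defaultVariableSelectStrategy.handleChange({
  value: ['frontend'],
  variableData,
  optionsData,
  allAvailableOptionStrings: optionsData,
  onValueUpdate: (name, id, value, allSelected) => {
    // -> 'service', 'var1', ['frontend'], false
  },
});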

View File

@@ -0,0 +1,37 @@
import { ALL_SELECT_VALUE } from 'container/DashboardContainer/utils';
import { VariableSelectStrategy } from './variableSelectStrategyTypes';
export const dynamicVariableSelectStrategy: VariableSelectStrategy = {
handleChange({
value,
variableData,
allAvailableOptionStrings,
onValueUpdate,
}) {
if (!variableData.name) {
return;
}
if (
value === ALL_SELECT_VALUE ||
(Array.isArray(value) && value.includes(ALL_SELECT_VALUE))
) {
// For ALL selection in dynamic variables, pass null to avoid storing every value;
// the parent component handles the ALL state appropriately
onValueUpdate(variableData.name, variableData.id, null, true);
} else {
// Flag selections that include values outside the known options (custom, typed-in values)
const haveCustomValuesSelected =
Array.isArray(value) &&
!value.every((v) => allAvailableOptionStrings.includes(v.toString()));
onValueUpdate(
variableData.name,
variableData.id,
value,
allAvailableOptionStrings.every((v) => value.includes(v.toString())),
haveCustomValuesSelected,
);
}
},
};
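
And a contrasting sketch for the dynamic strategy, again with fabricated values: the ALL sentinel stores null, while any value outside the known option strings flips haveCustomValuesSelected:

import { ALL_SELECT_VALUE } from 'container/DashboardContainer/utils';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import { dynamicVariableSelectStrategy } from './dynamicVariableSelectStrategy';

const known = ['frontend', 'backend'];
const variableData = { name: 'service', id: 'var1' } as IDashboardVariable;

// ALL sentinel -> store null rather than every value, with allSelected = true
dynamicVariableSelectStrategy.handleChange({
  value: ALL_SELECT_VALUE,
  variableData,
  optionsData: known,
  allAvailableOptionStrings: known,
  onValueUpdate: (name, id, value, allSelected) => {
    // -> 'service', 'var1', null, true
  },
});

// A typed-in value outside the known options -> allSelected = false, haveCustomValuesSelected = true
dynamicVariableSelectStrategy.handleChange({
  value: ['frontend', 'checkout'],
  variableData,
  optionsData: known,
  allAvailableOptionStrings: known,
  onValueUpdate: (name, id, value, allSelected, haveCustomValuesSelected) => {
    // -> 'service', 'var1', ['frontend', 'checkout'], false, true
  },
});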

View File

@@ -0,0 +1,4 @@
import { defaultVariableSelectStrategy } from './defaultVariableSelectStrategy';
import { VariableSelectStrategy } from './variableSelectStrategyTypes';
export const queryVariableSelectStrategy: VariableSelectStrategy = defaultVariableSelectStrategy;

View File

@@ -0,0 +1,17 @@
import { IDashboardVariable } from 'types/api/dashboard/getAll';
export interface VariableSelectStrategy {
handleChange(params: {
value: string | string[];
variableData: IDashboardVariable;
optionsData: (string | number | boolean)[];
allAvailableOptionStrings: string[];
onValueUpdate: (
name: string,
id: string,
value: IDashboardVariable['selectedValue'],
allSelected: boolean,
haveCustomValuesSelected?: boolean,
) => void;
}): void;
}

View File

@@ -0,0 +1,32 @@
import { isEqual } from 'lodash-es';
export const areArraysEqualIgnoreOrder = (
a: (string | number | boolean)[],
b: (string | number | boolean)[],
): boolean => {
if (a.length !== b.length) {
return false;
}
const sortedA = [...a].sort();
const sortedB = [...b].sort();
return isEqual(sortedA, sortedB);
};
export const uniqueValues = (values: string[] | string): string[] | string => {
if (Array.isArray(values)) {
const uniqueValues: string[] = [];
const seenValues = new Set<string>();
values.forEach((value) => {
if (seenValues.has(value)) {
return;
}
seenValues.add(value);
uniqueValues.push(value);
});
return uniqueValues;
}
return values;
};
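
Illustrative behavior of the two helpers (import path abbreviated):

import {
  areArraysEqualIgnoreOrder,
  uniqueValues,
} from './variableSelectStrategyUtils';

areArraysEqualIgnoreOrder(['a', 'b', 'c'], ['c', 'a', 'b']); // true (order ignored)
areArraysEqualIgnoreOrder(['a', 'b'], ['a', 'b', 'b']); // false (lengths differ)
uniqueValues(['a', 'b', 'a', 'c', 'b']); // ['a', 'b', 'c'] (first-occurrence order kept)
uniqueValues('a'); // 'a' (non-array values pass through unchanged)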

View File

@@ -2,7 +2,7 @@
/* eslint-disable sonarjs/no-duplicate-string */
/* eslint-disable react/jsx-props-no-spreading */
import React from 'react';
import { QueryClient, QueryClientProvider } from 'react-query';
import { QueryClient, QueryClientProvider, useQuery } from 'react-query';
import * as ReactRedux from 'react-redux';
import {
act,
@@ -15,13 +15,22 @@ import {
import { getFieldValues } from 'api/dynamicVariables/getFieldValues';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import DynamicVariableSelection from '../DashboardVariablesSelection/DynamicVariableSelection';
import DynamicVariableInput from '../DashboardVariablesSelection/DynamicVariableInput';
// Mock the getFieldValues API
jest.mock('api/dynamicVariables/getFieldValues', () => ({
getFieldValues: jest.fn(),
}));
// Mock useQuery from react-query
jest.mock('react-query', () => {
const originalModule = jest.requireActual('react-query');
return {
...originalModule,
useQuery: jest.fn(),
};
});
describe('Dynamic Variable Default Behavior', () => {
const mockOnValueUpdate = jest.fn();
const mockApiResponse = {
@@ -59,6 +68,46 @@ describe('Dynamic Variable Default Behavior', () => {
// Mock getFieldValues API to return our test data
(getFieldValues as jest.Mock).mockResolvedValue(mockApiResponse);
// Mock useQuery implementation to avoid infinite re-renders
// and ensure onSuccess is called once
(useQuery as jest.Mock).mockImplementation((key, options) => {
const { onSuccess, enabled, queryFn } = options || {};
const variableName = key[1];
const dynamicVarsKey = key[2];
React.useEffect(() => {
if (enabled !== false) {
if (onSuccess) {
// For 'services' tests:
// 1. "Default to ALL" expectations imply empty options -> [] behavior. This happens when selectedValue is undefined (dynamicVarsKey has 'null').
// 2. "ALL Option Special Value" needs full options to render the "ALL" item in dropdown. This happens when selectedValue is defined.
if (
variableName === 'services' &&
typeof dynamicVarsKey === 'string' &&
dynamicVarsKey.includes('null')
) {
onSuccess({
...mockApiResponse,
data: { ...mockApiResponse.data, normalizedValues: [] },
});
} else {
onSuccess(mockApiResponse);
}
}
if (queryFn) {
queryFn();
}
}
}, [enabled, variableName, dynamicVarsKey]); // Only depend on enabled/keys
return {
isLoading: false,
isError: false,
data: mockApiResponse,
refetch: jest.fn(),
};
});
jest.spyOn(ReactRedux, 'useSelector').mockReturnValue({
minTime: '2023-01-01T00:00:00Z',
maxTime: '2023-01-02T00:00:00Z',
@@ -84,7 +133,7 @@ describe('Dynamic Variable Default Behavior', () => {
await act(async () => {
renderWithQueryClient(
<DynamicVariableSelection
<DynamicVariableInput
variableData={variableData}
existingVariables={{ var1: variableData }}
onValueUpdate={mockOnValueUpdate}
@@ -120,7 +169,7 @@ describe('Dynamic Variable Default Behavior', () => {
await act(async () => {
renderWithQueryClient(
<DynamicVariableSelection
<DynamicVariableInput
variableData={variableData}
existingVariables={{ var1: variableData }}
onValueUpdate={mockOnValueUpdate}
@@ -164,7 +213,7 @@ describe('Dynamic Variable Default Behavior', () => {
await act(async () => {
renderWithQueryClient(
<DynamicVariableSelection
<DynamicVariableInput
variableData={variableData}
existingVariables={{ var1: variableData }}
onValueUpdate={mockOnValueUpdate}
@@ -172,9 +221,6 @@ describe('Dynamic Variable Default Behavior', () => {
);
});
// Component should render without errors
expect(screen.getByText('$service')).toBeInTheDocument();
// Check if the dropdown is present
const selectElement = screen.getByRole('combobox');
expect(selectElement).toBeInTheDocument();
@@ -232,7 +278,7 @@ describe('Dynamic Variable Default Behavior', () => {
await act(async () => {
renderWithQueryClient(
<DynamicVariableSelection
<DynamicVariableInput
variableData={variableData}
existingVariables={{ var1: variableData }}
onValueUpdate={mockOnValueUpdate}
@@ -267,7 +313,7 @@ describe('Dynamic Variable Default Behavior', () => {
await act(async () => {
renderWithQueryClient(
<DynamicVariableSelection
<DynamicVariableInput
variableData={variableData}
existingVariables={{ var1: variableData }}
onValueUpdate={mockOnValueUpdate}
@@ -293,7 +339,7 @@ describe('Dynamic Variable Default Behavior', () => {
expect(screen.queryByText('backend')).not.toBeInTheDocument();
});
it('should default to ALL when no default and no previous selection', async () => {
const variableData: IDashboardVariable = {
id: 'var21',
name: 'services',
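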
@@ -311,7 +357,7 @@ describe('Dynamic Variable Default Behavior', () => {
await act(async () => {
renderWithQueryClient(
<DynamicVariableSelection
<DynamicVariableInput
variableData={variableData}
existingVariables={{ var1: variableData }}
onValueUpdate={mockOnValueUpdate}
@@ -345,7 +391,7 @@ describe('Dynamic Variable Default Behavior', () => {
expect(mockOnValueUpdate).toHaveBeenCalledWith(
'services',
'var21',
[], // Empty array when allSelected is true
[],
true, // allSelected = true
false,
);
@@ -371,7 +417,7 @@ describe('Dynamic Variable Default Behavior', () => {
await act(async () => {
renderWithQueryClient(
<DynamicVariableSelection
<DynamicVariableInput
variableData={variableData}
existingVariables={{ var1: variableData }}
onValueUpdate={mockOnValueUpdate}
@@ -408,7 +454,7 @@ describe('Dynamic Variable Default Behavior', () => {
await act(async () => {
renderWithQueryClient(
<DynamicVariableSelection
<DynamicVariableInput
variableData={variableData}
existingVariables={{ var1: variableData }}
onValueUpdate={mockOnValueUpdate}
@@ -416,9 +462,6 @@ describe('Dynamic Variable Default Behavior', () => {
);
});
// Component should render without errors
expect(screen.getByText('$services')).toBeInTheDocument();
// Check if ALL is displayed in the UI (in the main selection area)
const allTextElement = screen.getByText('ALL');
expect(allTextElement).toBeInTheDocument();

View File

@@ -0,0 +1,117 @@
import { useCallback, useLayoutEffect, useMemo, useRef } from 'react';
import ChartWrapper from 'container/DashboardContainer/visualization/charts/ChartWrapper/ChartWrapper';
import BarChartTooltip from 'lib/uPlotV2/components/Tooltip/BarChartTooltip';
import {
BarTooltipProps,
TooltipRenderArgs,
} from 'lib/uPlotV2/components/types';
import { has } from 'lodash-es';
import uPlot from 'uplot';
import { BarChartProps } from '../types';
import { stack } from './stackUtils';
/** Returns true if the series at the given index is hidden (e.g. via legend toggle) */
function isSeriesHidden(plot: uPlot, seriesIndex: number): boolean {
return !plot.series[seriesIndex]?.show;
}
export default function BarChart(props: BarChartProps): JSX.Element {
const { children, isStackedBarChart, config, data, ...rest } = props;
// Store unstacked source data so uPlot hooks can access it (hooks run outside React's render cycle)
const unstackedDataRef = useRef<uPlot.AlignedData | null>(null);
unstackedDataRef.current = isStackedBarChart ? data : null;
// Prevents re-entrant calls when we update chart data (avoids infinite loop in setData hook)
const isUpdatingChartRef = useRef(false);
const chartData = useMemo((): uPlot.AlignedData => {
if (!isStackedBarChart || !data || data.length < 2) {
return data as uPlot.AlignedData;
}
const noSeriesHidden = (): boolean => false; // include all series in initial stack
const { data: stacked } = stack(data as uPlot.AlignedData, noSeriesHidden);
return stacked;
}, [data, isStackedBarChart]);
const applyStackingToChart = useCallback((plot: uPlot): void => {
const unstacked = unstackedDataRef.current;
const hasValidData =
unstacked && plot.data && unstacked[0]?.length === plot.data[0]?.length;
if (!hasValidData || isUpdatingChartRef.current) {
return;
}
const shouldExcludeSeries = (idx: number): boolean =>
isSeriesHidden(plot, idx);
const { data: stacked, bands } = stack(unstacked, shouldExcludeSeries);
plot.delBand(null);
bands.forEach((band: uPlot.Band) => plot.addBand(band));
isUpdatingChartRef.current = true;
plot.setData(stacked);
isUpdatingChartRef.current = false;
}, []);
useLayoutEffect(() => {
if (!isStackedBarChart || !config) {
return undefined;
}
const onDataChange = (plot: uPlot): void => {
if (!isUpdatingChartRef.current) {
applyStackingToChart(plot);
}
};
const onSeriesVisibilityChange = (
plot: uPlot,
_seriesIdx: number | null,
opts: uPlot.Series,
): void => {
if (has(opts, 'focus')) {
return;
}
applyStackingToChart(plot);
};
const removeSetDataHook = config.addHook('setData', onDataChange);
const removeSetSeriesHook = config.addHook(
'setSeries',
onSeriesVisibilityChange,
);
return (): void => {
removeSetDataHook?.();
removeSetSeriesHook?.();
};
}, [isStackedBarChart, config, applyStackingToChart]);
const renderTooltip = useCallback(
(props: TooltipRenderArgs): React.ReactNode => {
const tooltipProps: BarTooltipProps = {
...props,
timezone: rest.timezone,
yAxisUnit: rest.yAxisUnit,
decimalPrecision: rest.decimalPrecision,
isStackedBarChart: isStackedBarChart,
};
return <BarChartTooltip {...tooltipProps} />;
},
[rest.timezone, rest.yAxisUnit, rest.decimalPrecision, isStackedBarChart],
);
return (
<ChartWrapper
{...rest}
config={config}
data={chartData}
renderTooltip={renderTooltip}
>
{children}
</ChartWrapper>
);
}

View File

@@ -0,0 +1,57 @@
import uPlot, { AlignedData } from 'uplot';
/**
* Stack data cumulatively (top-down: first series = top, last = bottom).
* When omit(i) returns true for a series index, that series is excluded from stacking.
*/
export function stack(
data: AlignedData,
omit: (seriesIndex: number) => boolean,
): { data: AlignedData; bands: uPlot.Band[] } {
const d0Len = data[0].length;
const n = data.length - 1;
const data2: (number | null)[][] = Array(n);
const accum = Array(d0Len).fill(0) as number[];
// Accumulate from last series upward: last = raw, first = total
for (let i = n; i >= 1; i--) {
const seriesData = data[i] as (number | null)[];
if (omit(i)) {
data2[i - 1] = seriesData;
} else {
data2[i - 1] = seriesData.map((v, idx) => {
const val = v == null ? 0 : Number(v);
return (accum[idx] += val);
});
}
}
const bands: uPlot.Band[] = [];
// Bands: [upper, lower] - fill between consecutive visible series
for (let i = 1; i < data.length; i++) {
if (!omit(i)) {
const nextIdx = data.findIndex((_, j) => j > i && !omit(j));
if (nextIdx >= 1) {
bands.push({ series: [i, nextIdx] });
}
}
}
return {
data: [data[0], ...data2] as AlignedData,
bands,
};
}
/**
* Returns band indices for initial stacked state (no series omitted).
* Top-down: first series at top, band fills between consecutive series.
* uPlot band format: [upperSeriesIdx, lowerSeriesIdx].
*/
export function getInitialStackedBands(seriesCount: number): uPlot.Band[] {
const bands: uPlot.Band[] = [];
for (let i = 1; i < seriesCount; i++) {
bands.push({ series: [i, i + 1] });
}
return bands;
}
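
A worked example with toy numbers of what stack() returns for two series over two timestamps when no series is omitted; a hidden series (omit(i) === true) would instead keep its raw values and be skipped when bands are built:

import { AlignedData } from 'uplot';
import { stack } from './stackUtils';

const unstacked: AlignedData = [
  [0, 1], // x (timestamps)
  [1, 2], // series 1 (drawn on top, becomes the running total)
  [3, 4], // series 2 (drawn at the bottom, stays raw)
];

const { data, bands } = stack(unstacked, () => false);
// data  -> [[0, 1], [4, 6], [3, 4]]
// bands -> [{ series: [1, 2] }]  (fill between series 1 and series 2)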

View File

@@ -0,0 +1,91 @@
import { useCallback, useRef } from 'react';
import ChartLayout from 'container/DashboardContainer/visualization/layout/ChartLayout/ChartLayout';
import Legend from 'lib/uPlotV2/components/Legend/Legend';
import { LegendPosition } from 'lib/uPlotV2/components/types';
import UPlotChart from 'lib/uPlotV2/components/UPlotChart';
import { PlotContextProvider } from 'lib/uPlotV2/context/PlotContext';
import TooltipPlugin from 'lib/uPlotV2/plugins/TooltipPlugin/TooltipPlugin';
import noop from 'lodash-es/noop';
import uPlot from 'uplot';
import { ChartProps } from '../types';
const TOOLTIP_WIDTH_PADDING = 60;
const TOOLTIP_MIN_WIDTH = 200;
export default function ChartWrapper({
legendConfig = { position: LegendPosition.BOTTOM },
config,
data,
width: containerWidth,
height: containerHeight,
showTooltip = true,
canPinTooltip = false,
syncMode,
syncKey,
onDestroy = noop,
children,
layoutChildren,
renderTooltip,
'data-testid': testId,
}: ChartProps): JSX.Element {
const plotInstanceRef = useRef<uPlot | null>(null);
const legendComponent = useCallback(
(averageLegendWidth: number): React.ReactNode => {
return (
<Legend
config={config}
position={legendConfig.position}
averageLegendWidth={averageLegendWidth}
/>
);
},
[config, legendConfig.position],
);
return (
<PlotContextProvider>
<ChartLayout
config={config}
containerWidth={containerWidth}
containerHeight={containerHeight}
legendConfig={legendConfig}
legendComponent={legendComponent}
layoutChildren={layoutChildren}
>
{({ chartWidth, chartHeight, averageLegendWidth }): JSX.Element => (
<UPlotChart
config={config}
data={data}
width={chartWidth}
height={chartHeight}
plotRef={(plot): void => {
plotInstanceRef.current = plot;
}}
onDestroy={(plot: uPlot): void => {
plotInstanceRef.current = null;
onDestroy(plot);
}}
data-testid={testId}
>
{children}
{showTooltip && (
<TooltipPlugin
config={config}
canPinTooltip={canPinTooltip}
syncMode={syncMode}
maxWidth={Math.max(
TOOLTIP_MIN_WIDTH,
averageLegendWidth + TOOLTIP_WIDTH_PADDING,
)}
syncKey={syncKey}
render={renderTooltip}
/>
)}
</UPlotChart>
)}
</ChartLayout>
</PlotContextProvider>
);
}

View File

@@ -1,104 +1,32 @@
import { useCallback, useRef } from 'react';
import ChartLayout from 'container/DashboardContainer/visualization/layout/ChartLayout/ChartLayout';
import Legend from 'lib/uPlotV2/components/Legend/Legend';
import Tooltip from 'lib/uPlotV2/components/Tooltip/Tooltip';
import { useCallback } from 'react';
import ChartWrapper from 'container/DashboardContainer/visualization/charts/ChartWrapper/ChartWrapper';
import TimeSeriesTooltip from 'lib/uPlotV2/components/Tooltip/TimeSeriesTooltip';
import {
LegendPosition,
TimeSeriesTooltipProps,
TooltipRenderArgs,
} from 'lib/uPlotV2/components/types';
import UPlotChart from 'lib/uPlotV2/components/UPlotChart';
import { PlotContextProvider } from 'lib/uPlotV2/context/PlotContext';
import TooltipPlugin from 'lib/uPlotV2/plugins/TooltipPlugin/TooltipPlugin';
import _noop from 'lodash-es/noop';
import uPlot from 'uplot';
import { ChartProps } from '../types';
import { TimeSeriesChartProps } from '../types';
const TOOLTIP_WIDTH_PADDING = 60;
const TOOLTIP_MIN_WIDTH = 200;
export default function TimeSeries(props: TimeSeriesChartProps): JSX.Element {
const { children, ...rest } = props;
export default function TimeSeries({
legendConfig = { position: LegendPosition.BOTTOM },
config,
data,
width: containerWidth,
height: containerHeight,
disableTooltip = false,
canPinTooltip = false,
timezone,
yAxisUnit,
decimalPrecision,
syncMode,
syncKey,
onDestroy = _noop,
children,
layoutChildren,
'data-testid': testId,
}: ChartProps): JSX.Element {
const plotInstanceRef = useRef<uPlot | null>(null);
const legendComponent = useCallback(
(averageLegendWidth: number): React.ReactNode => {
return (
<Legend
config={config}
position={legendConfig.position}
averageLegendWidth={averageLegendWidth}
/>
);
const renderTooltip = useCallback(
(props: TooltipRenderArgs): React.ReactNode => {
const tooltipProps: TimeSeriesTooltipProps = {
...props,
timezone: rest.timezone,
yAxisUnit: rest.yAxisUnit,
decimalPrecision: rest.decimalPrecision,
};
return <TimeSeriesTooltip {...tooltipProps} />;
},
[config, legendConfig.position],
[rest.timezone, rest.yAxisUnit, rest.decimalPrecision],
);
return (
<PlotContextProvider>
<ChartLayout
config={config}
containerWidth={containerWidth}
containerHeight={containerHeight}
legendConfig={legendConfig}
legendComponent={legendComponent}
layoutChildren={layoutChildren}
>
{({ chartWidth, chartHeight, averageLegendWidth }): JSX.Element => (
<UPlotChart
config={config}
data={data}
width={chartWidth}
height={chartHeight}
plotRef={(plot): void => {
plotInstanceRef.current = plot;
}}
onDestroy={(plot: uPlot): void => {
plotInstanceRef.current = null;
onDestroy(plot);
}}
data-testid={testId}
>
{children}
{!disableTooltip && (
<TooltipPlugin
config={config}
canPinTooltip={canPinTooltip}
syncMode={syncMode}
maxWidth={Math.max(
TOOLTIP_MIN_WIDTH,
averageLegendWidth + TOOLTIP_WIDTH_PADDING,
)}
syncKey={syncKey}
render={(props: TooltipRenderArgs): React.ReactNode => (
<Tooltip
{...props}
timezone={timezone}
yAxisUnit={yAxisUnit}
decimalPrecision={decimalPrecision}
/>
)}
/>
)}
</UPlotChart>
)}
</ChartLayout>
</PlotContextProvider>
<ChartWrapper {...rest} renderTooltip={renderTooltip}>
{children}
</ChartWrapper>
);
}

View File

@@ -1,29 +1,40 @@
import { PrecisionOption } from 'components/Graph/types';
import { LegendConfig } from 'lib/uPlotV2/components/types';
import { LegendConfig, TooltipRenderArgs } from 'lib/uPlotV2/components/types';
import { UPlotConfigBuilder } from 'lib/uPlotV2/config/UPlotConfigBuilder';
import { DashboardCursorSync } from 'lib/uPlotV2/plugins/TooltipPlugin/types';
interface BaseChartProps {
width: number;
height: number;
disableTooltip?: boolean;
showTooltip?: boolean;
timezone: string;
syncMode?: DashboardCursorSync;
syncKey?: string;
canPinTooltip?: boolean;
yAxisUnit?: string;
decimalPrecision?: PrecisionOption;
'data-testid'?: string;
}
interface TimeSeriesChartProps extends BaseChartProps {
interface UPlotBasedChartProps {
config: UPlotConfigBuilder;
legendConfig: LegendConfig;
data: uPlot.AlignedData;
syncMode?: DashboardCursorSync;
syncKey?: string;
plotRef?: (plot: uPlot | null) => void;
onDestroy?: (plot: uPlot) => void;
children?: React.ReactNode;
layoutChildren?: React.ReactNode;
'data-testid'?: string;
}
export type ChartProps = TimeSeriesChartProps;
export interface TimeSeriesChartProps
extends BaseChartProps,
UPlotBasedChartProps {
legendConfig: LegendConfig;
}
export interface BarChartProps extends BaseChartProps, UPlotBasedChartProps {
legendConfig: LegendConfig;
isStackedBarChart?: boolean;
}
export type ChartProps = (TimeSeriesChartProps | BarChartProps) & {
renderTooltip: (props: TooltipRenderArgs) => React.ReactNode;
};

View File

@@ -0,0 +1,154 @@
import { useEffect, useMemo, useRef, useState } from 'react';
import { PanelWrapperProps } from 'container/PanelWrapper/panelWrapper.types';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { useResizeObserver } from 'hooks/useDimensions';
import { LegendPosition } from 'lib/uPlotV2/components/types';
import ContextMenu from 'periscope/components/ContextMenu';
import { useDashboard } from 'providers/Dashboard/Dashboard';
import { useTimezone } from 'providers/Timezone';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
import uPlot from 'uplot';
import { getTimeRange } from 'utils/getTimeRange';
import BarChart from '../../charts/BarChart/BarChart';
import ChartManager from '../../components/ChartManager/ChartManager';
import { usePanelContextMenu } from '../../hooks/usePanelContextMenu';
import { prepareBarPanelConfig, prepareBarPanelData } from './utils';
function BarPanel(props: PanelWrapperProps): JSX.Element {
const {
panelMode,
queryResponse,
widget,
onDragSelect,
isFullViewMode,
onToggleModelHandler,
} = props;
const uPlotRef = useRef<uPlot | null>(null);
const { toScrollWidgetId, setToScrollWidgetId } = useDashboard();
const graphRef = useRef<HTMLDivElement>(null);
const [minTimeScale, setMinTimeScale] = useState<number>();
const [maxTimeScale, setMaxTimeScale] = useState<number>();
const containerDimensions = useResizeObserver(graphRef);
const isDarkMode = useIsDarkMode();
const { timezone } = useTimezone();
useEffect(() => {
if (toScrollWidgetId === widget.id) {
graphRef.current?.scrollIntoView({
behavior: 'smooth',
block: 'center',
});
graphRef.current?.focus();
setToScrollWidgetId('');
}
}, [toScrollWidgetId, setToScrollWidgetId, widget.id]);
useEffect((): void => {
const { startTime, endTime } = getTimeRange(queryResponse);
setMinTimeScale(startTime);
setMaxTimeScale(endTime);
}, [queryResponse]);
const {
coordinates,
popoverPosition,
onClose,
menuItemsConfig,
clickHandlerWithContextMenu,
} = usePanelContextMenu({
widget,
queryResponse,
});
const config = useMemo(() => {
return prepareBarPanelConfig({
widget,
isDarkMode,
currentQuery: widget.query,
onClick: clickHandlerWithContextMenu,
onDragSelect,
apiResponse: queryResponse?.data?.payload as MetricRangePayloadProps,
timezone,
panelMode,
minTimeScale: minTimeScale,
maxTimeScale: maxTimeScale,
});
}, [
widget,
isDarkMode,
queryResponse?.data?.payload,
clickHandlerWithContextMenu,
onDragSelect,
minTimeScale,
maxTimeScale,
timezone,
panelMode,
]);
const chartData = useMemo(() => {
if (!queryResponse?.data?.payload) {
return [];
}
return prepareBarPanelData(queryResponse?.data?.payload);
}, [queryResponse?.data?.payload]);
const layoutChildren = useMemo(() => {
if (!isFullViewMode) {
return null;
}
return (
<ChartManager
config={config}
alignedData={chartData}
yAxisUnit={widget.yAxisUnit}
onCancel={onToggleModelHandler}
/>
);
}, [
isFullViewMode,
config,
chartData,
widget.yAxisUnit,
onToggleModelHandler,
]);
return (
<div style={{ height: '100%', width: '100%' }} ref={graphRef}>
{containerDimensions.width > 0 && containerDimensions.height > 0 && (
<BarChart
config={config}
legendConfig={{
position: widget?.legendPosition ?? LegendPosition.BOTTOM,
}}
plotRef={(plot: uPlot | null): void => {
uPlotRef.current = plot;
}}
onDestroy={(): void => {
uPlotRef.current = null;
}}
yAxisUnit={widget.yAxisUnit}
decimalPrecision={widget.decimalPrecision}
timezone={timezone.value}
data={chartData as uPlot.AlignedData}
width={containerDimensions.width}
height={containerDimensions.height}
layoutChildren={layoutChildren}
isStackedBarChart={widget.stackedBarChart ?? false}
>
<ContextMenu
coordinates={coordinates}
popoverPosition={popoverPosition}
title={menuItemsConfig.header as string}
items={menuItemsConfig.items}
onClose={onClose}
/>
</BarChart>
)}
</div>
);
}
export default BarPanel;

View File

@@ -0,0 +1,109 @@
import { Timezone } from 'components/CustomTimePicker/timezoneUtils';
import { PANEL_TYPES } from 'constants/queryBuilder';
import { getInitialStackedBands } from 'container/DashboardContainer/visualization/charts/BarChart/stackUtils';
import { getLegend } from 'lib/dashboard/getQueryResults';
import getLabelName from 'lib/getLabelName';
import { OnClickPluginOpts } from 'lib/uPlotLib/plugins/onClickPlugin';
import {
DrawStyle,
LineInterpolation,
LineStyle,
VisibilityMode,
} from 'lib/uPlotV2/config/types';
import { UPlotConfigBuilder } from 'lib/uPlotV2/config/UPlotConfigBuilder';
import { Widgets } from 'types/api/dashboard/getAll';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
import { Query } from 'types/api/queryBuilder/queryBuilderData';
import { QueryData } from 'types/api/widgets/getQuery';
import { AlignedData } from 'uplot';
import { PanelMode } from '../types';
import { fillMissingXAxisTimestamps, getXAxisTimestamps } from '../utils';
import { buildBaseConfig } from '../utils/baseConfigBuilder';
export function prepareBarPanelData(
apiResponse: MetricRangePayloadProps,
): AlignedData {
const seriesList = apiResponse?.data?.result || [];
const timestampArr = getXAxisTimestamps(seriesList);
const yAxisValuesArr = fillMissingXAxisTimestamps(timestampArr, seriesList);
return [timestampArr, ...yAxisValuesArr];
}
export function prepareBarPanelConfig({
widget,
isDarkMode,
currentQuery,
onClick,
onDragSelect,
apiResponse,
timezone,
panelMode,
minTimeScale,
maxTimeScale,
}: {
widget: Widgets;
isDarkMode: boolean;
currentQuery: Query;
onClick: OnClickPluginOpts['onClick'];
onDragSelect: (startTime: number, endTime: number) => void;
apiResponse: MetricRangePayloadProps;
timezone: Timezone;
panelMode: PanelMode;
minTimeScale?: number;
maxTimeScale?: number;
}): UPlotConfigBuilder {
const builder = buildBaseConfig({
widget,
isDarkMode,
onClick,
onDragSelect,
apiResponse,
timezone,
panelMode,
panelType: PANEL_TYPES.BAR,
minTimeScale,
maxTimeScale,
});
builder.setCursor({
focus: {
prox: 1e3,
},
});
if (widget.stackedBarChart) {
const seriesCount = (apiResponse?.data?.result?.length ?? 0) + 1; // +1 for 1-based uPlot series indices
builder.setBands(getInitialStackedBands(seriesCount));
builder.addScale({ scaleKey: 'y', softMin: 0 });
}
const seriesList: QueryData[] = apiResponse?.data?.result || [];
seriesList.forEach((series) => {
const baseLabelName = getLabelName(
series.metric,
series.queryName || '', // query
series.legend || '',
);
const label = currentQuery
? getLegend(series, currentQuery, baseLabelName)
: baseLabelName;
builder.addSeries({
scaleKey: 'y',
drawStyle: DrawStyle.Bar,
panelType: PANEL_TYPES.BAR,
label: label,
colorMapping: widget.customLegendColors ?? {},
spanGaps: false,
lineStyle: LineStyle.Solid,
lineInterpolation: LineInterpolation.Spline,
showPoints: VisibilityMode.Never,
pointSize: 5,
isDarkMode,
});
});
return builder;
}

View File

@@ -6,7 +6,6 @@ import { PanelWrapperProps } from 'container/PanelWrapper/panelWrapper.types';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { useResizeObserver } from 'hooks/useDimensions';
import { LegendPosition } from 'lib/uPlotV2/components/types';
import { LineInterpolation } from 'lib/uPlotV2/config/types';
import { ContextMenu } from 'periscope/components/ContextMenu';
import { useDashboard } from 'providers/Dashboard/Dashboard';
import { useTimezone } from 'providers/Timezone';
@@ -73,55 +72,28 @@ function TimeSeriesPanel(props: PanelWrapperProps): JSX.Element {
}, [queryResponse?.data?.payload]);
const config = useMemo(() => {
const tzDate = (timestamp: number): Date =>
uPlot.tzDate(new Date(timestamp * 1e3), timezone.value);
return prepareUPlotConfig({
widgetId: widget.id || '',
apiResponse: queryResponse?.data?.payload as MetricRangePayloadProps,
tzDate,
minTimeScale: minTimeScale,
maxTimeScale: maxTimeScale,
isLogScale: widget?.isLogScale ?? false,
thresholds: {
scaleKey: 'y',
thresholds: (widget.thresholds || []).map((threshold) => ({
thresholdValue: threshold.thresholdValue ?? 0,
thresholdColor: threshold.thresholdColor,
thresholdUnit: threshold.thresholdUnit,
thresholdLabel: threshold.thresholdLabel,
})),
yAxisUnit: widget.yAxisUnit,
},
yAxisUnit: widget.yAxisUnit || '',
softMin: widget.softMin === undefined ? null : widget.softMin,
softMax: widget.softMax === undefined ? null : widget.softMax,
spanGaps: false,
colorMapping: widget.customLegendColors ?? {},
lineInterpolation: LineInterpolation.Spline,
widget,
isDarkMode,
currentQuery: widget.query,
onClick: clickHandlerWithContextMenu,
onDragSelect,
currentQuery: widget.query,
apiResponse: queryResponse?.data?.payload as MetricRangePayloadProps,
timezone,
panelMode,
minTimeScale: minTimeScale,
maxTimeScale: maxTimeScale,
});
}, [
widget.id,
maxTimeScale,
minTimeScale,
timezone.value,
widget.customLegendColors,
widget.isLogScale,
widget.softMax,
widget.softMin,
widget,
isDarkMode,
queryResponse?.data?.payload,
widget.query,
widget.thresholds,
widget.yAxisUnit,
panelMode,
clickHandlerWithContextMenu,
onDragSelect,
queryResponse?.data?.payload,
panelMode,
minTimeScale,
maxTimeScale,
timezone,
]);
const layoutChildren = useMemo(() => {

View File

@@ -1,3 +1,4 @@
import { Timezone } from 'components/CustomTimePicker/timezoneUtils';
import { PANEL_TYPES } from 'constants/queryBuilder';
import {
fillMissingXAxisTimestamps,
@@ -5,23 +6,20 @@ import {
} from 'container/DashboardContainer/visualization/panels/utils';
import { getLegend } from 'lib/dashboard/getQueryResults';
import getLabelName from 'lib/getLabelName';
import onClickPlugin, {
OnClickPluginOpts,
} from 'lib/uPlotLib/plugins/onClickPlugin';
import { OnClickPluginOpts } from 'lib/uPlotLib/plugins/onClickPlugin';
import {
DistributionType,
DrawStyle,
LineInterpolation,
LineStyle,
SelectionPreferencesSource,
VisibilityMode,
} from 'lib/uPlotV2/config/types';
import { UPlotConfigBuilder } from 'lib/uPlotV2/config/UPlotConfigBuilder';
import { ThresholdsDrawHookOptions } from 'lib/uPlotV2/hooks/types';
import { Widgets } from 'types/api/dashboard/getAll';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
import { Query } from 'types/api/queryBuilder/queryBuilderData';
import { PanelMode } from '../types';
import { buildBaseConfig } from '../utils/baseConfigBuilder';
export const prepareChartData = (
apiResponse: MetricRangePayloadProps,
@@ -34,112 +32,39 @@ export const prepareChartData = (
};
export const prepareUPlotConfig = ({
widgetId,
apiResponse,
tzDate,
minTimeScale,
maxTimeScale,
isLogScale,
thresholds,
softMin,
softMax,
spanGaps,
colorMapping,
lineInterpolation,
widget,
isDarkMode,
currentQuery,
onDragSelect,
onClick,
yAxisUnit,
onDragSelect,
apiResponse,
timezone,
panelMode,
minTimeScale,
maxTimeScale,
}: {
widgetId: string;
apiResponse: MetricRangePayloadProps;
tzDate: uPlot.LocalDateFromUnix;
minTimeScale: number | undefined;
maxTimeScale: number | undefined;
isLogScale: boolean;
softMin: number | null;
softMax: number | null;
spanGaps: boolean;
colorMapping: Record<string, string>;
lineInterpolation: LineInterpolation;
widget: Widgets;
isDarkMode: boolean;
thresholds: ThresholdsDrawHookOptions;
currentQuery: Query;
yAxisUnit: string;
onClick: OnClickPluginOpts['onClick'];
onDragSelect: (startTime: number, endTime: number) => void;
onClick?: OnClickPluginOpts['onClick'];
apiResponse: MetricRangePayloadProps;
timezone: Timezone;
panelMode: PanelMode;
minTimeScale?: number;
maxTimeScale?: number;
}): UPlotConfigBuilder => {
const builder = new UPlotConfigBuilder({
const builder = buildBaseConfig({
widget,
isDarkMode,
onClick,
onDragSelect,
widgetId,
tzDate,
shouldSaveSelectionPreference: panelMode === PanelMode.DASHBOARD_VIEW,
selectionPreferencesSource: [
PanelMode.DASHBOARD_VIEW,
PanelMode.STANDALONE_VIEW,
].includes(panelMode)
? SelectionPreferencesSource.LOCAL_STORAGE
: SelectionPreferencesSource.IN_MEMORY,
});
// X scale time axis
builder.addScale({
scaleKey: 'x',
time: true,
min: minTimeScale,
max: maxTimeScale,
logBase: isLogScale ? 10 : undefined,
distribution: isLogScale
? DistributionType.Logarithmic
: DistributionType.Linear,
});
// Y scale value axis, driven primarily by softMin/softMax and data
builder.addScale({
scaleKey: 'y',
time: false,
min: undefined,
max: undefined,
softMin: softMin ?? undefined,
softMax: softMax ?? undefined,
thresholds,
logBase: isLogScale ? 10 : undefined,
distribution: isLogScale
? DistributionType.Logarithmic
: DistributionType.Linear,
});
builder.addThresholds(thresholds);
if (typeof onClick === 'function') {
builder.addPlugin(
onClickPlugin({
onClick,
apiResponse,
}),
);
}
builder.addAxis({
scaleKey: 'x',
show: true,
side: 2,
isDarkMode,
isLogScale: false,
panelType: PANEL_TYPES.TIME_SERIES,
});
builder.addAxis({
scaleKey: 'y',
show: true,
side: 3,
isDarkMode,
isLogScale: false,
yAxisUnit,
apiResponse,
timezone,
panelMode,
panelType: PANEL_TYPES.TIME_SERIES,
minTimeScale,
maxTimeScale,
});
apiResponse.data?.result?.forEach((series) => {
@@ -157,14 +82,16 @@ export const prepareUPlotConfig = ({
scaleKey: 'y',
drawStyle: DrawStyle.Line,
label: label,
colorMapping,
spanGaps,
colorMapping: widget.customLegendColors ?? {},
spanGaps: false,
lineStyle: LineStyle.Solid,
lineInterpolation,
lineInterpolation: LineInterpolation.Spline,
showPoints: VisibilityMode.Never,
pointSize: 5,
isDarkMode,
panelType: PANEL_TYPES.TIME_SERIES,
});
});
return builder;
};

View File

@@ -0,0 +1,127 @@
import { Timezone } from 'components/CustomTimePicker/timezoneUtils';
import { PANEL_TYPES } from 'constants/queryBuilder';
import onClickPlugin, {
OnClickPluginOpts,
} from 'lib/uPlotLib/plugins/onClickPlugin';
import {
DistributionType,
SelectionPreferencesSource,
} from 'lib/uPlotV2/config/types';
import { UPlotConfigBuilder } from 'lib/uPlotV2/config/UPlotConfigBuilder';
import { ThresholdsDrawHookOptions } from 'lib/uPlotV2/hooks/types';
import { Widgets } from 'types/api/dashboard/getAll';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';
import uPlot from 'uplot';
import { PanelMode } from '../types';
export interface BaseConfigBuilderProps {
widget: Widgets;
apiResponse: MetricRangePayloadProps;
isDarkMode: boolean;
onClick: OnClickPluginOpts['onClick'];
onDragSelect: (startTime: number, endTime: number) => void;
timezone: Timezone;
panelMode: PanelMode;
panelType: PANEL_TYPES;
minTimeScale?: number;
maxTimeScale?: number;
}
export function buildBaseConfig({
widget,
isDarkMode,
onClick,
onDragSelect,
apiResponse,
timezone,
panelMode,
panelType,
minTimeScale,
maxTimeScale,
}: BaseConfigBuilderProps): UPlotConfigBuilder {
const tzDate = (timestamp: number): Date =>
uPlot.tzDate(new Date(timestamp * 1e3), timezone.value);
const builder = new UPlotConfigBuilder({
onDragSelect,
widgetId: widget.id,
tzDate,
shouldSaveSelectionPreference: panelMode === PanelMode.DASHBOARD_VIEW,
selectionPreferencesSource: [
PanelMode.DASHBOARD_VIEW,
PanelMode.STANDALONE_VIEW,
].includes(panelMode)
? SelectionPreferencesSource.LOCAL_STORAGE
: SelectionPreferencesSource.IN_MEMORY,
});
const thresholdOptions: ThresholdsDrawHookOptions = {
scaleKey: 'y',
thresholds: (widget.thresholds || []).map((threshold) => ({
thresholdValue: threshold.thresholdValue ?? 0,
thresholdColor: threshold.thresholdColor,
thresholdUnit: threshold.thresholdUnit,
thresholdLabel: threshold.thresholdLabel,
})),
yAxisUnit: widget.yAxisUnit,
};
builder.addThresholds(thresholdOptions);
builder.addScale({
scaleKey: 'x',
time: true,
min: minTimeScale,
max: maxTimeScale,
logBase: widget.isLogScale ? 10 : undefined,
distribution: widget.isLogScale
? DistributionType.Logarithmic
: DistributionType.Linear,
});
// Y scale value axis, driven primarily by softMin/softMax and data
builder.addScale({
scaleKey: 'y',
time: false,
min: undefined,
max: undefined,
softMin: widget.softMin ?? undefined,
softMax: widget.softMax ?? undefined,
// thresholds,
logBase: widget.isLogScale ? 10 : undefined,
distribution: widget.isLogScale
? DistributionType.Logarithmic
: DistributionType.Linear,
});
if (typeof onClick === 'function') {
builder.addPlugin(
onClickPlugin({
onClick,
apiResponse,
}),
);
}
builder.addAxis({
scaleKey: 'x',
show: true,
side: 2,
isDarkMode,
isLogScale: widget.isLogScale,
panelType,
});
builder.addAxis({
scaleKey: 'y',
show: true,
side: 3,
isDarkMode,
isLogScale: widget.isLogScale,
yAxisUnit: widget.yAxisUnit,
panelType,
});
return builder;
}

View File

@@ -33,8 +33,8 @@ import { useChartMutable } from 'hooks/useChartMutable';
import useComponentPermission from 'hooks/useComponentPermission';
import { useSafeNavigate } from 'hooks/useSafeNavigate';
import useUrlQuery from 'hooks/useUrlQuery';
import { getDashboardVariables } from 'lib/dashbaordVariables/getDashboardVariables';
import { GetQueryResultsProps } from 'lib/dashboard/getQueryResults';
import { getDashboardVariables } from 'lib/dashboardVariables/getDashboardVariables';
import GetMinMax from 'lib/getMinMax';
import { isEmpty } from 'lodash-es';
import { useAppContext } from 'providers/App/App';

View File

@@ -8,8 +8,8 @@ import { populateMultipleResults } from 'container/NewWidget/LeftContainer/Widge
import { CustomTimeType } from 'container/TopNav/DateTimeSelectionV2/types';
import { useGetQueryRange } from 'hooks/queryBuilder/useGetQueryRange';
import { useIntersectionObserver } from 'hooks/useIntersectionObserver';
import { getDashboardVariables } from 'lib/dashbaordVariables/getDashboardVariables';
import { GetQueryResultsProps } from 'lib/dashboard/getQueryResults';
import { getDashboardVariables } from 'lib/dashboardVariables/getDashboardVariables';
import getTimeString from 'lib/getTimeString';
import { isEqual } from 'lodash-es';
import isEmpty from 'lodash-es/isEmpty';

View File

@@ -6,7 +6,7 @@ import { prepareQueryRangePayloadV5 } from 'api/v5/v5';
import { PANEL_TYPES } from 'constants/queryBuilder';
import { timePreferenceType } from 'container/NewWidget/RightContainer/timeItems';
import { useDashboardVariablesByType } from 'hooks/dashboard/useDashboardVariablesByType';
import { getDashboardVariables } from 'lib/dashbaordVariables/getDashboardVariables';
import { getDashboardVariables } from 'lib/dashboardVariables/getDashboardVariables';
import { mapQueryDataFromApi } from 'lib/newQueryBuilder/queryBuilderMappers/mapQueryDataFromApi';
import { AppState } from 'store/reducers';
import { Query } from 'types/api/queryBuilder/queryBuilderData';

View File

@@ -103,7 +103,7 @@ export const getTotalLogSizeWidgetData = (): Widgets =>
queryName: 'A',
reduceTo: ReduceOperators.SUM,
spaceAggregation: 'sum',
stepInterval: 60,
stepInterval: null,
timeAggregation: 'increase',
},
],
@@ -140,7 +140,7 @@ export const getTotalTraceSizeWidgetData = (): Widgets =>
queryName: 'A',
reduceTo: ReduceOperators.SUM,
spaceAggregation: 'sum',
stepInterval: 60,
stepInterval: null,
timeAggregation: 'increase',
},
],
@@ -177,7 +177,7 @@ export const getTotalMetricDatapointCountWidgetData = (): Widgets =>
queryName: 'A',
reduceTo: ReduceOperators.SUM,
spaceAggregation: 'sum',
stepInterval: 60,
stepInterval: null,
timeAggregation: 'increase',
},
],
@@ -214,7 +214,7 @@ export const getLogCountWidgetData = (): Widgets =>
queryName: 'A',
reduceTo: ReduceOperators.AVG,
spaceAggregation: 'sum',
stepInterval: 60,
stepInterval: null,
timeAggregation: 'increase',
},
],
@@ -251,7 +251,7 @@ export const getLogSizeWidgetData = (): Widgets =>
queryName: 'A',
reduceTo: ReduceOperators.AVG,
spaceAggregation: 'sum',
stepInterval: 60,
stepInterval: null,
timeAggregation: 'increase',
},
],
@@ -288,7 +288,7 @@ export const getSpanCountWidgetData = (): Widgets =>
queryName: 'A',
reduceTo: ReduceOperators.AVG,
spaceAggregation: 'sum',
stepInterval: 60,
stepInterval: null,
timeAggregation: 'increase',
},
],
@@ -325,7 +325,7 @@ export const getSpanSizeWidgetData = (): Widgets =>
queryName: 'A',
reduceTo: ReduceOperators.AVG,
spaceAggregation: 'sum',
stepInterval: 60,
stepInterval: null,
timeAggregation: 'increase',
},
],
@@ -362,7 +362,7 @@ export const getMetricCountWidgetData = (): Widgets =>
queryName: 'A',
reduceTo: ReduceOperators.AVG,
spaceAggregation: 'sum',
stepInterval: 60,
stepInterval: null,
timeAggregation: 'increase',
},
],

View File

@@ -27,8 +27,8 @@ import { useIsDarkMode } from 'hooks/useDarkMode';
import { useSafeNavigate } from 'hooks/useSafeNavigate';
import useUrlQuery from 'hooks/useUrlQuery';
import createQueryParams from 'lib/createQueryParams';
import { getDashboardVariables } from 'lib/dashbaordVariables/getDashboardVariables';
import { GetQueryResultsProps } from 'lib/dashboard/getQueryResults';
import { getDashboardVariables } from 'lib/dashboardVariables/getDashboardVariables';
import { cloneDeep, defaultTo, isEmpty, isUndefined } from 'lodash-es';
import { Check, X } from 'lucide-react';
import { DashboardWidgetPageParams } from 'pages/DashboardWidget';

View File

@@ -323,7 +323,6 @@
"java instrumentation",
"java k8s",
"java logs",
"java metrics",
"java microservices",
"java monitoring",
"java observability",
@@ -332,10 +331,12 @@
"java vm",
"java windows",
"jboss",
"wildfly",
"jdk 11",
"jdk 17",
"jdk 21",
"jdk 8",
"jdbc",
"opentelemetry java",
"quarkus",
"spring boot",
@@ -372,6 +373,12 @@
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-springboot/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-springboot/"
},
{
"key": "windows",
"label": "Windows",
@@ -403,6 +410,12 @@
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-tomcat/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-tomcat/"
},
{
"key": "windows",
"label": "Windows",
@@ -414,7 +427,7 @@
},
{
"key": "jboss",
"label": "JBoss",
"label": "JBoss/WildFly",
"imgUrl": "/Logos/jboss.svg",
"link": "/docs/instrumentation/opentelemetry-jboss/",
"question": {
@@ -434,6 +447,12 @@
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-jboss/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-jboss/"
},
{
"key": "windows",
"label": "Windows",
@@ -465,6 +484,12 @@
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-quarkus/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-quarkus/"
},
{
"key": "windows",
"label": "Windows",
@@ -495,6 +520,18 @@
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-java/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-java/"
},
{
"key": "windows",
"label": "Windows",
"imgUrl": "/Logos/windows.svg",
"link": "/docs/instrumentation/opentelemetry-java/"
}
]
}
@@ -530,7 +567,6 @@
"python in containers",
"python instrumentation",
"python k8s",
"python metrics",
"python microservices",
"python observability",
"python on kubernetes",
@@ -583,7 +619,6 @@
],
"module": "apm",
"relatedSearchKeywords": [
"angular",
"apm",
"apm/traces",
"application performance monitoring",
@@ -595,7 +630,6 @@
"next.js",
"nodejs",
"nuxtjs",
"reactjs",
"traces",
"tracing",
"ts",
@@ -610,8 +644,8 @@
"entityID": "framework",
"options": [
{
"key": "nodejs",
"label": "NodeJs",
"key": "nodejs-nestjs",
"label": "NodeJs/NestJS",
"imgUrl": "/Logos/nodejs.svg",
"link": "/docs/instrumentation/opentelemetry-javascript/",
"question": {
@@ -631,6 +665,12 @@
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-javascript/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-javascript/"
},
{
"key": "windows",
"label": "Windows",
@@ -640,99 +680,6 @@
]
}
},
{
"key": "express",
"label": "Express",
"imgUrl": "/Logos/express.svg",
"link": "/docs/instrumentation/opentelemetry-express/",
"question": {
"desc": "What is your Environment?",
"type": "select",
"entityID": "environment",
"options": [
{
"key": "vm",
"label": "VM",
"imgUrl": "/Logos/vm.svg",
"link": "/docs/instrumentation/opentelemetry-express/"
},
{
"key": "k8s",
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-express/"
},
{
"key": "windows",
"label": "Windows",
"imgUrl": "/Logos/windows.svg",
"link": "/docs/instrumentation/opentelemetry-express/"
}
]
}
},
{
"key": "nestjs",
"label": "NestJS",
"imgUrl": "/Logos/nestjs.svg",
"link": "/docs/instrumentation/opentelemetry-nestjs/",
"question": {
"desc": "What is your Environment?",
"type": "select",
"entityID": "environment",
"options": [
{
"key": "vm",
"label": "VM",
"imgUrl": "/Logos/vm.svg",
"link": "/docs/instrumentation/opentelemetry-nestjs/"
},
{
"key": "k8s",
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-nestjs/"
},
{
"key": "windows",
"label": "Windows",
"imgUrl": "/Logos/windows.svg",
"link": "/docs/instrumentation/opentelemetry-nestjs/"
}
]
}
},
{
"key": "angular",
"label": "Angular",
"imgUrl": "/Logos/angular.svg",
"link": "/docs/instrumentation/opentelemetry-angular/",
"question": {
"desc": "What is your Environment?",
"type": "select",
"entityID": "environment",
"options": [
{
"key": "vm",
"label": "VM",
"imgUrl": "/Logos/vm.svg",
"link": "/docs/instrumentation/opentelemetry-angular/"
},
{
"key": "k8s",
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-angular/"
},
{
"key": "windows",
"label": "Windows",
"imgUrl": "/Logos/windows.svg",
"link": "/docs/instrumentation/opentelemetry-angular/"
}
]
}
},
{
"key": "nextjs",
"label": "NextJS",
@@ -755,6 +702,12 @@
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-nextjs/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-nextjs/"
},
{
"key": "windows",
"label": "Windows",
@@ -764,37 +717,6 @@
]
}
},
{
"key": "reactjs",
"label": "ReactJS",
"imgUrl": "/Logos/reactjs.svg",
"link": "/docs/instrumentation/opentelemetry-reactjs/",
"question": {
"desc": "What is your Environment?",
"type": "select",
"entityID": "environment",
"options": [
{
"key": "vm",
"label": "VM",
"imgUrl": "/Logos/vm.svg",
"link": "/docs/instrumentation/opentelemetry-reactjs/"
},
{
"key": "k8s",
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-reactjs/"
},
{
"key": "windows",
"label": "Windows",
"imgUrl": "/Logos/windows.svg",
"link": "/docs/instrumentation/opentelemetry-reactjs/"
}
]
}
},
{
"key": "nuxtjs",
"label": "NuxtJS",
@@ -817,6 +739,12 @@
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-nuxtjs/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-nuxtjs/"
},
{
"key": "windows",
"label": "Windows",
@@ -830,32 +758,7 @@
"key": "react-native",
"label": "React Native",
"imgUrl": "/Logos/reactjs.svg",
"link": "/docs/instrumentation/opentelemetry-react-native/",
"question": {
"desc": "What is your Environment?",
"type": "select",
"entityID": "environment",
"options": [
{
"key": "vm",
"label": "VM",
"imgUrl": "/Logos/vm.svg",
"link": "/docs/instrumentation/opentelemetry-react-native/"
},
{
"key": "k8s",
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-react-native/"
},
{
"key": "windows",
"label": "Windows",
"imgUrl": "/Logos/windows.svg",
"link": "/docs/instrumentation/opentelemetry-react-native/"
}
]
}
"link": "/docs/instrumentation/opentelemetry-react-native/"
}
]
}
@@ -889,7 +792,6 @@
"golang",
"golang apm",
"golang instrumentation",
"golang metrics",
"golang observability",
"golang performance monitoring",
"golang tracing",
@@ -980,69 +882,39 @@
"php-fpm monitoring",
"slim php",
"traces",
"tracing"
"tracing",
"wordpress"
],
"id": "php",
"link": "/docs/instrumentation/opentelemetry-php/",
"question": {
"desc": "Which PHP framework do you use?",
"desc": "What is your Environment?",
"type": "select",
"entityID": "framework",
"entityID": "environment",
"options": [
{
"key": "laravel",
"label": "Laravel",
"imgUrl": "/Logos/laravel.svg",
"link": "/docs/instrumentation/laravel/",
"question": {
"desc": "What is your Environment?",
"type": "select",
"entityID": "environment",
"options": [
{
"key": "vm",
"label": "VM",
"imgUrl": "/Logos/vm.svg",
"link": "/docs/instrumentation/laravel/"
},
{
"key": "k8s",
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/laravel/"
}
]
}
"key": "vm",
"label": "VM",
"imgUrl": "/Logos/vm.svg",
"link": "/docs/instrumentation/opentelemetry-php/"
},
{
"key": "Others",
"label": "Others",
"imgUrl": "/Logos/php-others.svg",
"link": "/docs/instrumentation/opentelemetry-php/",
"question": {
"desc": "What is your Environment?",
"type": "select",
"entityID": "environment",
"options": [
{
"key": "vm",
"label": "VM",
"imgUrl": "/Logos/vm.svg",
"link": "/docs/instrumentation/opentelemetry-php/"
},
{
"key": "k8s",
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-php/"
},
{
"key": "windows",
"label": "Windows",
"imgUrl": "/Logos/windows.svg",
"link": "/docs/instrumentation/opentelemetry-php/"
}
]
}
"key": "k8s",
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-php/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-php/"
},
{
"key": "windows",
"label": "Windows",
"imgUrl": "/Logos/windows.svg",
"link": "/docs/instrumentation/opentelemetry-php/"
}
]
}
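
The restructured PHP entry above now asks for the environment directly instead of a framework first, and every data source in this file follows the same nested option/question shape. A hypothetical TypeScript sketch of that shape, inferred only from the JSON shown here (not the repo's actual type definitions):

// Hypothetical model of the onboarding entries in this JSON; names are
// illustrative and may not match the repo's real types.
interface OnboardingOption {
	key: string;
	label: string;
	imgUrl: string;
	link: string;
	// Java frameworks still nest an environment question here; the PHP entry
	// above now skips the framework level entirely.
	question?: OnboardingQuestion;
}

interface OnboardingQuestion {
	desc: string; // e.g. "What is your Environment?"
	type: 'select';
	entityID: 'framework' | 'environment';
	options: OnboardingOption[];
}

interface OnboardingDataSource {
	dataSource: string;
	label: string;
	imgUrl: string;
	tags: string[];
	module: string;
	relatedSearchKeywords: string[];
	id: string;
	link: string;
	question?: OnboardingQuestion;
}
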
@@ -1117,6 +989,12 @@
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-dotnet/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-dotnet/"
},
{
"key": "windows",
"label": "Windows",
@@ -1128,7 +1006,7 @@
},
{
"dataSource": "ruby-on-rails",
"label": "Ruby on Rails",
"label": "Ruby",
"imgUrl": "/Logos/ruby-on-rails.svg",
"tags": [
"apm/traces"
@@ -1168,9 +1046,12 @@
"ruby tracing",
"ruby-on-rails",
"traces",
"tracing"
"tracing",
"sinatra",
"sidekiq",
"resque"
],
"id": "ruby-on-rails",
"id": "ruby",
"link": "/docs/instrumentation/opentelemetry-ruby-on-rails/",
"question": {
"desc": "What is your Environment?",
@@ -1189,6 +1070,12 @@
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/instrumentation/opentelemetry-ruby-on-rails/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/instrumentation/opentelemetry-ruby-on-rails/"
},
{
"key": "windows",
"label": "Windows",
@@ -1468,41 +1355,6 @@
"id": "nginx-tracing",
"link": "/docs/instrumentation/opentelemetry-nginx/"
},
{
"dataSource": "opentelemetry-wordpress",
"label": "WordPress",
"imgUrl": "/Logos/wordpress.svg",
"tags": [
"apm/traces"
],
"module": "apm",
"relatedSearchKeywords": [
"apm",
"apm/traces",
"application performance monitoring",
"monitor wordpress site",
"opentelemetry",
"opentelemetry wordpress",
"opentelemetry-wordpress",
"otel",
"otel wordpress",
"traces",
"tracing",
"wordpress",
"wordpress apm",
"wordpress instrumentation",
"wordpress metrics",
"wordpress monitoring",
"wordpress observability",
"wordpress performance",
"wordpress php monitoring",
"wordpress plugin monitoring",
"wordpress to signoz",
"wordpress tracing"
],
"id": "opentelemetry-wordpress",
"link": "/docs/instrumentation/opentelemetry-wordpress/"
},
{
"dataSource": "opentelemetry-cloudflare",
"label": "Cloudflare Tracing",
@@ -1567,6 +1419,25 @@
"id": "opentelemetry-cloudflare-logs",
"link": "/docs/logs-management/send-logs/cloudflare-logs/"
},
{
"dataSource": "convex-logs",
"label": "Convex Logs",
"imgUrl": "/Logos/convex-logo.svg",
"tags": [
"logs"
],
"module": "logs",
"relatedSearchKeywords": [
"convex",
"convex log streaming",
"convex logs",
"convex webhook",
"logging",
"logs"
],
"id": "convex-logs",
"link": "/docs/logs-management/send-logs/convex-log-streams-signoz/"
},
{
"dataSource": "kubernetes-pod-logs",
"label": "Kubernetes Pod Logs",
@@ -4568,6 +4439,7 @@
],
"module": "metrics",
"relatedSearchKeywords": [
"browser metrics",
"core web vitals metrics",
"frontend metrics",
"frontend monitoring",
@@ -4587,7 +4459,7 @@
"svelte",
"sveltekit",
"nextjs",
"next.js"
"next.js"
],
"link": "/docs/frontend-monitoring/web-vitals-with-metrics/"
},
@@ -4622,7 +4494,7 @@
"svelte",
"sveltekit",
"nextjs",
"next.js"
"next.js"
],
"link": "/docs/frontend-monitoring/web-vitals-with-traces/"
},
@@ -4664,7 +4536,7 @@
"svelte",
"sveltekit",
"nextjs",
"next.js"
"next.js"
],
"link": "/docs/frontend-monitoring/document-load/"
},
@@ -5316,23 +5188,6 @@
],
"link": "/docs/metrics-management/mysql-metrics/"
},
{
"dataSource": "jvm",
"label": "JVM",
"imgUrl": "/Logos/java.svg",
"tags": [
"metrics"
],
"module": "metrics",
"relatedSearchKeywords": [
"java virtual machine",
"jvm",
"jvm metrics",
"jvm monitoring",
"runtime"
],
"link": "/docs/tutorial/jvm-metrics/"
},
{
"dataSource": "jmx",
"label": "JMX",
@@ -5348,7 +5203,7 @@
"jmx monitoring",
"runtime"
],
"link": "/docs/tutorial/jmx-metrics/"
"link": "/docs/metrics-management/send-metrics/applications/opentelemetry-java/jmx-metrics/"
},
{
"dataSource": "prometheus-metrics",
@@ -5574,16 +5429,17 @@
"svelte",
"sveltekit",
"nextjs",
"next.js"
"next.js"
],
"link": "/docs/frontend-monitoring/sending-logs-with-opentelemetry/"
},
{
"dataSource": "frontend-traces",
"label": "Frontend Traces",
"label": "Frontend Tracing",
"imgUrl": "/Logos/traces.svg",
"tags": [
"Frontend Monitoring"
"Frontend Monitoring",
"apm/traces"
],
"module": "apm",
"relatedSearchKeywords": [
@@ -5605,7 +5461,7 @@
"svelte",
"sveltekit",
"nextjs",
"next.js"
"next.js"
],
"link": "/docs/frontend-monitoring/sending-traces-with-opentelemetry/"
},
@@ -5636,7 +5492,7 @@
"svelte",
"sveltekit",
"nextjs",
"next.js"
"next.js"
],
"link": "/docs/frontend-monitoring/sending-metrics-with-opentelemetry/"
},
@@ -5683,7 +5539,6 @@
"label": "Golang Metrics",
"imgUrl": "/Logos/go.svg",
"tags": [
"apm/traces",
"metrics"
],
"module": "metrics",
@@ -5737,12 +5592,64 @@
]
}
},
{
"dataSource": "java-metrics",
"label": "Java Metrics",
"imgUrl": "/Logos/java.svg",
"tags": [
"metrics"
],
"module": "metrics",
"relatedSearchKeywords": [
"java",
"java metrics",
"java monitoring",
"java observability",
"jvm metrics",
"metrics",
"opentelemetry java",
"otel java",
"runtime metrics"
],
"id": "java-metrics",
"link": "/docs/metrics-management/send-metrics/applications/opentelemetry-java/",
"question": {
"desc": "What is your Environment?",
"type": "select",
"entityID": "environment",
"options": [
{
"key": "vm",
"label": "VM",
"imgUrl": "/Logos/vm.svg",
"link": "/docs/metrics-management/send-metrics/applications/opentelemetry-java/"
},
{
"key": "k8s",
"label": "Kubernetes",
"imgUrl": "/Logos/kubernetes.svg",
"link": "/docs/metrics-management/send-metrics/applications/opentelemetry-java/"
},
{
"key": "windows",
"label": "Windows",
"imgUrl": "/Logos/windows.svg",
"link": "/docs/metrics-management/send-metrics/applications/opentelemetry-java/"
},
{
"key": "docker",
"label": "Docker",
"imgUrl": "/Logos/docker.svg",
"link": "/docs/metrics-management/send-metrics/applications/opentelemetry-java/"
}
]
}
},
{
"dataSource": "python-metrics",
"label": "Python Metrics",
"imgUrl": "/Logos/python.svg",
"tags": [
"apm/traces",
"metrics"
],
"module": "metrics",
@@ -5796,7 +5703,6 @@
"label": ".NET Metrics",
"imgUrl": "/Logos/dotnet.svg",
"tags": [
"apm/traces",
"metrics"
],
"module": "metrics",
@@ -5853,7 +5759,6 @@
"label": "Node.js Metrics",
"imgUrl": "/Logos/nodejs.svg",
"tags": [
"apm/traces",
"metrics"
],
"module": "metrics",

View File

@@ -1,5 +1,5 @@
/* eslint-disable sonarjs/no-duplicate-string */
import { FilterOutlined } from '@ant-design/icons';
import { FilterOutlined, VerticalAlignTopOutlined } from '@ant-design/icons';
import { Button, Tooltip } from 'antd';
import cx from 'classnames';
import { Atom, Binoculars, SquareMousePointer, Terminal } from 'lucide-react';
@@ -32,6 +32,7 @@ export default function LeftToolbarActions({
<Tooltip title="Show Filters">
<Button onClick={handleFilterVisibilityChange} className="filter-btn">
<FilterOutlined />
<VerticalAlignTopOutlined rotate={90} />
</Button>
</Tooltip>
)}

View File

@@ -7,7 +7,7 @@
align-items: center;
justify-content: center;
box-shadow: none;
width: 32px;
width: 40px;
height: 32px;
margin-right: 12px;
border: 1px solid var(--bg-slate-400);

View File

@@ -95,7 +95,6 @@ function ResourceAttributesFilter({
data-testid="resource-environment-filter"
style={{ minWidth: 200, height: 34 }}
onChange={handleEnvironmentChange}
onBlur={handleBlur}
>
{environments.map((opt) => (
<Select.Option key={opt.value} value={opt.value}>

View File

@@ -1,8 +1,8 @@
import { PANEL_TYPES } from 'constants/queryBuilder';
import { getWidgetQueryBuilder } from 'container/MetricsApplication/MetricsApplication.factory';
import { updateStepInterval } from 'hooks/queryBuilder/useStepInterval';
import { getDashboardVariables } from 'lib/dashbaordVariables/getDashboardVariables';
import { GetQueryResultsProps } from 'lib/dashboard/getQueryResults';
import { getDashboardVariables } from 'lib/dashboardVariables/getDashboardVariables';
import { ServicesList } from 'types/api/metrics/getService';
import { QueryDataV3 } from 'types/api/widgets/getQuery';
import { EQueryType } from 'types/common/dashboard';

View File

@@ -13,7 +13,7 @@ import { MenuItemKeys } from 'container/GridCardLayout/WidgetHeader/contants';
import { useDashboardVariables } from 'hooks/dashboard/useDashboardVariables';
import { useDashboardVariablesByType } from 'hooks/dashboard/useDashboardVariablesByType';
import { useNotifications } from 'hooks/useNotifications';
import { getDashboardVariables } from 'lib/dashbaordVariables/getDashboardVariables';
import { getDashboardVariables } from 'lib/dashboardVariables/getDashboardVariables';
import { mapQueryDataFromApi } from 'lib/newQueryBuilder/queryBuilderMappers/mapQueryDataFromApi';
import { isEmpty } from 'lodash-es';
import { useDashboard } from 'providers/Dashboard/Dashboard';

View File

@@ -3,8 +3,8 @@ import { useSelector } from 'react-redux';
import { initialQueriesMap } from 'constants/queryBuilder';
import { REACT_QUERY_KEY } from 'constants/reactQueryKeys';
import { useDashboardVariables } from 'hooks/dashboard/useDashboardVariables';
import { getDashboardVariables } from 'lib/dashbaordVariables/getDashboardVariables';
import { GetQueryResultsProps } from 'lib/dashboard/getQueryResults';
import { getDashboardVariables } from 'lib/dashboardVariables/getDashboardVariables';
import { AppState } from 'store/reducers';
import { SuccessResponse } from 'types/api';
import { MetricRangePayloadProps } from 'types/api/metrics/getQueryRange';

View File

@@ -0,0 +1,77 @@
import ROUTES from 'constants/routes';
import { whilelistedKeys } from '../config';
import { mappingWithRoutesAndKeys } from '../utils';
describe('useResourceAttribute config', () => {
describe('whilelistedKeys', () => {
it('should include underscore-notation keys (DOT_METRICS_ENABLED=false)', () => {
expect(whilelistedKeys).toContain('resource_deployment_environment');
expect(whilelistedKeys).toContain('resource_k8s_cluster_name');
expect(whilelistedKeys).toContain('resource_k8s_cluster_namespace');
});
it('should include dot-notation keys (DOT_METRICS_ENABLED=true)', () => {
expect(whilelistedKeys).toContain('resource_deployment.environment');
expect(whilelistedKeys).toContain('resource_k8s.cluster.name');
expect(whilelistedKeys).toContain('resource_k8s.cluster.namespace');
});
});
describe('mappingWithRoutesAndKeys', () => {
const dotNotationFilters = [
{
label: 'deployment.environment',
value: 'resource_deployment.environment',
},
{ label: 'k8s.cluster.name', value: 'resource_k8s.cluster.name' },
{ label: 'k8s.cluster.namespace', value: 'resource_k8s.cluster.namespace' },
];
const underscoreNotationFilters = [
{
label: 'deployment.environment',
value: 'resource_deployment_environment',
},
{ label: 'k8s.cluster.name', value: 'resource_k8s_cluster_name' },
{ label: 'k8s.cluster.namespace', value: 'resource_k8s_cluster_namespace' },
];
const nonWhitelistedFilters = [
{ label: 'host.name', value: 'resource_host_name' },
{ label: 'service.name', value: 'resource_service_name' },
];
it('should keep dot-notation filters on the Service Map route', () => {
const result = mappingWithRoutesAndKeys(
ROUTES.SERVICE_MAP,
dotNotationFilters,
);
expect(result).toHaveLength(3);
expect(result).toEqual(dotNotationFilters);
});
it('should keep underscore-notation filters on the Service Map route', () => {
const result = mappingWithRoutesAndKeys(
ROUTES.SERVICE_MAP,
underscoreNotationFilters,
);
expect(result).toHaveLength(3);
expect(result).toEqual(underscoreNotationFilters);
});
it('should filter out non-whitelisted keys on the Service Map route', () => {
const allFilters = [...dotNotationFilters, ...nonWhitelistedFilters];
const result = mappingWithRoutesAndKeys(ROUTES.SERVICE_MAP, allFilters);
expect(result).toHaveLength(3);
expect(result).toEqual(dotNotationFilters);
});
it('should return all filters on non-Service Map routes', () => {
const allFilters = [...dotNotationFilters, ...nonWhitelistedFilters];
const result = mappingWithRoutesAndKeys('/services', allFilters);
expect(result).toHaveLength(5);
expect(result).toEqual(allFilters);
});
});
});

View File

@@ -1,5 +1,8 @@
export const whilelistedKeys = [
'resource_deployment_environment',
'resource_deployment.environment',
'resource_k8s_cluster_name',
'resource_k8s.cluster.name',
'resource_k8s_cluster_namespace',
'resource_k8s.cluster.namespace',
];
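
The tests above pin down the contract of mappingWithRoutesAndKeys: on the Service Map route only whitelisted resource attributes survive, in either underscore or dot notation, and every other route passes its filters through unchanged. A minimal sketch of that contract, assuming the helper sits next to this config; the repo's implementation may differ in detail:

import ROUTES from 'constants/routes';
import { whilelistedKeys } from './config';

interface ResourceFilterTag {
	label: string;
	value: string;
}

// Minimal sketch of the behaviour the tests above assert; not the repo's exact code.
export function mappingWithRoutesAndKeys(
	route: string,
	tags: ResourceFilterTag[],
): ResourceFilterTag[] {
	// Service Map only understands the whitelisted resource attributes, whether
	// they arrive in underscore notation (DOT_METRICS_ENABLED=false) or dot
	// notation (DOT_METRICS_ENABLED=true).
	if (route === ROUTES.SERVICE_MAP) {
		return tags.filter((tag) => whilelistedKeys.includes(tag.value));
	}
	// Every other route keeps its full set of resource attribute filters.
	return tags;
}
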

View File

@@ -0,0 +1,31 @@
import { useMemo } from 'react';
import { BarTooltipProps, TooltipContentItem } from '../types';
import Tooltip from './Tooltip';
import { buildTooltipContent } from './utils';
export default function BarChartTooltip(props: BarTooltipProps): JSX.Element {
const content = useMemo(
(): TooltipContentItem[] =>
buildTooltipContent({
data: props.uPlotInstance.data,
series: props.uPlotInstance.series,
dataIndexes: props.dataIndexes,
activeSeriesIndex: props.seriesIndex,
uPlotInstance: props.uPlotInstance,
yAxisUnit: props.yAxisUnit ?? '',
decimalPrecision: props.decimalPrecision,
isStackedBarChart: props.isStackedBarChart,
}),
[
props.uPlotInstance,
props.seriesIndex,
props.dataIndexes,
props.yAxisUnit,
props.decimalPrecision,
props.isStackedBarChart,
],
);
return <Tooltip {...props} content={content} />;
}

View File

@@ -0,0 +1,31 @@
import { useMemo } from 'react';
import { TimeSeriesTooltipProps, TooltipContentItem } from '../types';
import Tooltip from './Tooltip';
import { buildTooltipContent } from './utils';
export default function TimeSeriesTooltip(
props: TimeSeriesTooltipProps,
): JSX.Element {
const content = useMemo(
(): TooltipContentItem[] =>
buildTooltipContent({
data: props.uPlotInstance.data,
series: props.uPlotInstance.series,
dataIndexes: props.dataIndexes,
activeSeriesIndex: props.seriesIndex,
uPlotInstance: props.uPlotInstance,
yAxisUnit: props.yAxisUnit ?? '',
decimalPrecision: props.decimalPrecision,
}),
[
props.uPlotInstance,
props.seriesIndex,
props.dataIndexes,
props.yAxisUnit,
props.decimalPrecision,
],
);
return <Tooltip {...props} content={content} />;
}

View File

@@ -5,8 +5,7 @@ import { DATE_TIME_FORMATS } from 'constants/dateTimeFormats';
import dayjs from 'dayjs';
import { useIsDarkMode } from 'hooks/useDarkMode';
import { TooltipContentItem, TooltipProps } from '../types';
import { buildTooltipContent } from './utils';
import { TooltipProps } from '../types';
import './Tooltip.styles.scss';
@@ -14,14 +13,12 @@ const TOOLTIP_LIST_MAX_HEIGHT = 330;
const TOOLTIP_ITEM_HEIGHT = 38;
export default function Tooltip({
seriesIndex,
dataIndexes,
uPlotInstance,
timezone,
yAxisUnit = '',
decimalPrecision,
content,
}: TooltipProps): JSX.Element {
const isDarkMode = useIsDarkMode();
const headerTitle = useMemo(() => {
const data = uPlotInstance.data;
const cursorIdx = uPlotInstance.cursor.idx;
@@ -33,20 +30,6 @@ export default function Tooltip({
.format(DATE_TIME_FORMATS.MONTH_DATETIME_SECONDS);
}, [timezone, uPlotInstance.data, uPlotInstance.cursor.idx]);
const content = useMemo(
(): TooltipContentItem[] =>
buildTooltipContent({
data: uPlotInstance.data,
series: uPlotInstance.series,
dataIndexes,
activeSeriesIndex: seriesIndex,
uPlotInstance,
yAxisUnit,
decimalPrecision,
}),
[uPlotInstance, seriesIndex, dataIndexes, yAxisUnit, decimalPrecision],
);
return (
<div
className={cx(

View File

@@ -20,6 +20,39 @@ export function resolveSeriesColor(
return FALLBACK_SERIES_COLOR;
}
export function getTooltipBaseValue({
data,
index,
dataIndex,
isStackedBarChart,
series,
}: {
data: AlignedData;
index: number;
dataIndex: number;
isStackedBarChart?: boolean;
series?: Series[];
}): number | null {
let baseValue = data[index][dataIndex] ?? null;
// Top-down stacking (first series at top): raw = stacked[i] - stacked[nextVisible].
// When series are hidden, we must use the next *visible* series, not index+1,
// since hidden series keep raw values and would produce negative/wrong results.
if (isStackedBarChart && baseValue !== null && series) {
let nextVisibleIdx = -1;
for (let j = index + 1; j < series.length; j++) {
if (series[j]?.show) {
nextVisibleIdx = j;
break;
}
}
if (nextVisibleIdx >= 1) {
const nextValue = data[nextVisibleIdx][dataIndex] ?? 0;
baseValue = baseValue - nextValue;
}
}
return baseValue;
}
export function buildTooltipContent({
data,
series,
@@ -28,6 +61,7 @@ export function buildTooltipContent({
uPlotInstance,
yAxisUnit,
decimalPrecision,
isStackedBarChart,
}: {
data: AlignedData;
series: Series[];
@@ -36,6 +70,7 @@ export function buildTooltipContent({
uPlotInstance: uPlot;
yAxisUnit: string;
decimalPrecision?: PrecisionOption;
isStackedBarChart?: boolean;
}): TooltipContentItem[] {
const active: TooltipContentItem[] = [];
const rest: TooltipContentItem[] = [];
@@ -52,23 +87,30 @@ export function buildTooltipContent({
continue;
}
const raw = data[index]?.[dataIndex];
const value = Number(raw);
const displayValue = Number.isNaN(value) ? 0 : value;
const baseValue = getTooltipBaseValue({
data,
index,
dataIndex,
isStackedBarChart,
series,
});
const isActive = index === activeSeriesIndex;
const item: TooltipContentItem = {
label: String(s.label ?? ''),
value: displayValue,
tooltipValue: getToolTipValue(displayValue, yAxisUnit, decimalPrecision),
color: resolveSeriesColor(s.stroke, uPlotInstance, index),
isActive,
};
if (Number.isFinite(baseValue) && baseValue !== null) {
const item: TooltipContentItem = {
label: String(s.label ?? ''),
value: baseValue,
tooltipValue: getToolTipValue(baseValue, yAxisUnit, decimalPrecision),
color: resolveSeriesColor(s.stroke, uPlotInstance, index),
isActive,
};
if (isActive) {
active.push(item);
} else {
rest.push(item);
if (isActive) {
active.push(item);
} else {
rest.push(item);
}
}
}
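
A hedged walk-through of getTooltipBaseValue above, with illustrative numbers rather than repo fixtures: for a top-down stacked bar chart the tooltip recovers each series' raw value by subtracting the next visible stacked series.

import type { AlignedData, Series } from 'uplot';
// Assumes getTooltipBaseValue is exported from the tooltip utils module shown above.
import { getTooltipBaseValue } from './utils';

// One timestamp, three visible series stacked top-down: 12, 7, 3.
const data: AlignedData = [[0], [12], [7], [3]];
const series: Series[] = [{}, { show: true }, { show: true }, { show: true }];

getTooltipBaseValue({ data, index: 1, dataIndex: 0, isStackedBarChart: true, series }); // 12 - 7 = 5
getTooltipBaseValue({ data, index: 2, dataIndex: 0, isStackedBarChart: true, series }); // 7 - 3 = 4
getTooltipBaseValue({ data, index: 3, dataIndex: 0, isStackedBarChart: true, series }); // 3 (no visible series below)
getTooltipBaseValue({ data, index: 2, dataIndex: 0, series }); // 7 (non-stacked charts return the raw point)
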

View File

@@ -0,0 +1,412 @@
import type { ReactNode } from 'react';
import { render, screen } from '@testing-library/react';
import { UPlotConfigBuilder } from 'lib/uPlotV2/config/UPlotConfigBuilder';
import type { AlignedData } from 'uplot';
import { PlotContextProvider } from '../../context/PlotContext';
import UPlotChart from '../UPlotChart';
// ---------------------------------------------------------------------------
// Mocks
// ---------------------------------------------------------------------------
jest.mock(
'container/DashboardContainer/visualization/panels/utils/legendVisibilityUtils',
() => ({
getStoredSeriesVisibility: jest.fn(),
updateSeriesVisibilityToLocalStorage: jest.fn(),
}),
);
jest.mock('@sentry/react', () => ({
ErrorBoundary: ({ children }: { children: ReactNode }): JSX.Element => (
<>{children}</>
),
}));
jest.mock('pages/ErrorBoundaryFallback/ErrorBoundaryFallback', () => ({
__esModule: true,
default: (): JSX.Element => <div>Error Fallback</div>,
}));
interface MockUPlotInstance {
root: HTMLDivElement;
setData: jest.Mock;
setSize: jest.Mock;
destroy: jest.Mock;
}
let instances: MockUPlotInstance[] = [];
const mockUPlotConstructor = jest.fn();
jest.mock('uplot', () => {
function MockUPlot(
opts: Record<string, unknown>,
data: unknown,
target: HTMLElement,
): MockUPlotInstance {
mockUPlotConstructor(opts, data, target);
const rootEl = document.createElement('div');
target.appendChild(rootEl);
const inst: MockUPlotInstance = {
root: rootEl,
setData: jest.fn(),
setSize: jest.fn(),
destroy: jest.fn(),
};
instances.push(inst);
return inst;
}
MockUPlot.paths = {
spline: jest.fn(() => jest.fn()),
bars: jest.fn(() => jest.fn()),
linear: jest.fn(() => jest.fn()),
stepped: jest.fn(() => jest.fn()),
};
MockUPlot.tzDate = jest.fn();
return { __esModule: true, default: MockUPlot };
});
// ---------------------------------------------------------------------------
// Helpers
// ---------------------------------------------------------------------------
const createMockConfig = (): UPlotConfigBuilder => {
return ({
getConfig: jest.fn().mockReturnValue({
series: [{ value: (): string => '' }],
axes: [],
scales: {},
hooks: {},
cursor: {},
}),
getWidgetId: jest.fn().mockReturnValue(undefined),
getShouldSaveSelectionPreference: jest.fn().mockReturnValue(false),
} as unknown) as UPlotConfigBuilder;
};
const validData: AlignedData = [
[1, 2, 3],
[10, 20, 30],
];
const emptyData: AlignedData = [[]];
const Wrapper = ({ children }: { children: ReactNode }): JSX.Element => (
<PlotContextProvider>{children}</PlotContextProvider>
);
// ---------------------------------------------------------------------------
// Tests
// ---------------------------------------------------------------------------
describe('UPlotChart', () => {
beforeEach(() => {
instances = [];
mockUPlotConstructor.mockClear();
});
describe('when data is empty', () => {
it('displays "No Data" message instead of the chart container', () => {
render(
<UPlotChart
config={createMockConfig()}
data={emptyData}
width={600}
height={400}
/>,
{ wrapper: Wrapper },
);
expect(screen.getByText('No Data')).toBeInTheDocument();
expect(screen.queryByTestId('uplot-main-div')).not.toBeInTheDocument();
});
it('sizes the empty-state container to the given width and height', () => {
render(
<UPlotChart
config={createMockConfig()}
data={emptyData}
width={750}
height={350}
/>,
{ wrapper: Wrapper },
);
const noDataContainer = screen
.getByText('No Data')
.closest('.uplot-no-data');
expect(noDataContainer).toHaveStyle({ width: '750px', height: '350px' });
});
it('does not create a uPlot instance', () => {
render(
<UPlotChart
config={createMockConfig()}
data={emptyData}
width={600}
height={400}
/>,
{ wrapper: Wrapper },
);
expect(mockUPlotConstructor).not.toHaveBeenCalled();
});
});
describe('chart container', () => {
it('renders children inside the chart wrapper', () => {
render(
<UPlotChart
config={createMockConfig()}
data={validData}
width={600}
height={400}
>
<div data-testid="tooltip-plugin">Tooltip</div>
</UPlotChart>,
{ wrapper: Wrapper },
);
expect(screen.getByTestId('tooltip-plugin')).toBeInTheDocument();
});
});
describe('plot creation', () => {
it('instantiates uPlot with floored dimensions and the container element', () => {
render(
<UPlotChart
config={createMockConfig()}
data={validData}
width={600.9}
height={400.2}
/>,
{ wrapper: Wrapper },
);
expect(mockUPlotConstructor).toHaveBeenCalledTimes(1);
const [opts, data, target] = mockUPlotConstructor.mock.calls[0];
expect(opts.width).toBe(600);
expect(opts.height).toBe(400);
expect(data).toBe(validData);
expect(target).toBe(screen.getByTestId('uplot-main-div'));
});
it('merges config builder output into the uPlot options', () => {
const config = createMockConfig();
config.getConfig = jest.fn().mockReturnValue({
series: [{ value: (): string => '' }],
axes: [{ scale: 'y' }],
scales: { y: {} },
hooks: {},
cursor: { show: true },
});
render(
<UPlotChart
config={(config as unknown) as UPlotConfigBuilder}
data={validData}
width={500}
height={300}
/>,
{ wrapper: Wrapper },
);
const [opts] = mockUPlotConstructor.mock.calls[0];
expect(opts.width).toBe(500);
expect(opts.height).toBe(300);
expect(opts.axes).toEqual([{ scale: 'y' }]);
expect(opts.cursor).toEqual({ show: true });
});
it('skips creation when width or height is 0', () => {
render(
<UPlotChart
config={createMockConfig()}
data={validData}
width={0}
height={0}
/>,
{ wrapper: Wrapper },
);
expect(mockUPlotConstructor).not.toHaveBeenCalled();
});
});
describe('lifecycle callbacks', () => {
it('invokes plotRef with the uPlot instance after creation', () => {
const plotRef = jest.fn();
render(
<UPlotChart
config={createMockConfig()}
data={validData}
width={600}
height={400}
plotRef={plotRef}
/>,
{ wrapper: Wrapper },
);
expect(plotRef).toHaveBeenCalledTimes(1);
expect(plotRef).toHaveBeenCalledWith(instances[0]);
});
it('destroys the instance and notifies callbacks when data becomes empty', () => {
const plotRef = jest.fn();
const onDestroy = jest.fn();
const config = createMockConfig();
const { rerender } = render(
<UPlotChart
config={config}
data={validData}
width={600}
height={400}
plotRef={plotRef}
onDestroy={onDestroy}
/>,
{ wrapper: Wrapper },
);
const firstInstance = instances[0];
plotRef.mockClear();
rerender(
<UPlotChart
config={config}
data={emptyData}
width={600}
height={400}
plotRef={plotRef}
onDestroy={onDestroy}
/>,
);
expect(onDestroy).toHaveBeenCalledWith(firstInstance);
expect(firstInstance.destroy).toHaveBeenCalled();
expect(plotRef).toHaveBeenCalledWith(null);
expect(screen.getByText('No Data')).toBeInTheDocument();
});
it('destroys the previous instance before creating a new one on config change', () => {
const onDestroy = jest.fn();
const config1 = createMockConfig();
const config2 = createMockConfig();
const { rerender } = render(
<UPlotChart
config={config1}
data={validData}
width={600}
height={400}
onDestroy={onDestroy}
/>,
{ wrapper: Wrapper },
);
const firstInstance = instances[0];
rerender(
<UPlotChart
config={config2}
data={validData}
width={600}
height={400}
onDestroy={onDestroy}
/>,
);
expect(onDestroy).toHaveBeenCalledWith(firstInstance);
expect(firstInstance.destroy).toHaveBeenCalled();
expect(instances).toHaveLength(2);
});
});
describe('prop updates', () => {
it('calls setData without recreating the plot when only data changes', () => {
const config = createMockConfig();
const newData: AlignedData = [
[4, 5, 6],
[40, 50, 60],
];
const { rerender } = render(
<UPlotChart config={config} data={validData} width={600} height={400} />,
{ wrapper: Wrapper },
);
const inst = instances[0];
rerender(
<UPlotChart config={config} data={newData} width={600} height={400} />,
);
expect(inst.setData).toHaveBeenCalledWith(newData);
expect(mockUPlotConstructor).toHaveBeenCalledTimes(1);
});
it('calls setSize with floored values when only dimensions change', () => {
const config = createMockConfig();
const { rerender } = render(
<UPlotChart config={config} data={validData} width={600} height={400} />,
{ wrapper: Wrapper },
);
const instance = instances[0];
rerender(
<UPlotChart
config={config}
data={validData}
width={800.7}
height={500.3}
/>,
);
expect(instance.setSize).toHaveBeenCalledWith({ width: 800, height: 500 });
expect(mockUPlotConstructor).toHaveBeenCalledTimes(1);
});
it('recreates the plot when config changes', () => {
const config1 = createMockConfig();
const config2 = createMockConfig();
const { rerender } = render(
<UPlotChart config={config1} data={validData} width={600} height={400} />,
{ wrapper: Wrapper },
);
rerender(
<UPlotChart config={config2} data={validData} width={600} height={400} />,
);
expect(mockUPlotConstructor).toHaveBeenCalledTimes(2);
});
it('does nothing when all props remain the same', () => {
const config = createMockConfig();
const { rerender } = render(
<UPlotChart config={config} data={validData} width={600} height={400} />,
{ wrapper: Wrapper },
);
const instance = instances[0];
rerender(
<UPlotChart config={config} data={validData} width={600} height={400} />,
);
expect(mockUPlotConstructor).toHaveBeenCalledTimes(1);
expect(instance.setData).not.toHaveBeenCalled();
expect(instance.setSize).not.toHaveBeenCalled();
});
});
});

View File

@@ -59,10 +59,22 @@ export interface TooltipRenderArgs {
viaSync: boolean;
}
export type TooltipProps = TooltipRenderArgs & {
export interface BaseTooltipProps {
timezone: string;
yAxisUnit?: string;
decimalPrecision?: PrecisionOption;
}
export interface TimeSeriesTooltipProps
extends BaseTooltipProps,
TooltipRenderArgs {}
export interface BarTooltipProps extends BaseTooltipProps, TooltipRenderArgs {
isStackedBarChart?: boolean;
}
export type TooltipProps = (TimeSeriesTooltipProps | BarTooltipProps) & {
content: TooltipContentItem[];
};
export enum LegendPosition {

View File

@@ -1,3 +1,4 @@
import { PANEL_TYPES } from 'constants/queryBuilder';
import { themeColors } from 'constants/theme';
import { generateColor } from 'lib/uPlotLib/utils/generateColor';
import uPlot, { Series } from 'uplot';
@@ -39,6 +40,11 @@ export class UPlotSeriesBuilder extends ConfigBuilder<SeriesProps, Series> {
if (lineCap) {
lineConfig.cap = lineCap;
}
if (this.props.panelType === PANEL_TYPES.BAR) {
lineConfig.fill = lineColor;
}
return lineConfig;
}
@@ -213,8 +219,9 @@ function getPathBuilder(
const linearBuilder = pathBuilders.linear;
const splineBuilder = pathBuilders.spline;
const steppedBuilder = pathBuilders.stepped;
const barBuilder = pathBuilders.bars;
if (!linearBuilder || !splineBuilder || !steppedBuilder) {
if (!linearBuilder || !splineBuilder || !steppedBuilder || !barBuilder) {
throw new Error('Required uPlot path builders are not available');
}
@@ -223,9 +230,14 @@ function getPathBuilder(
spline: splineBuilder(),
stepBefore: steppedBuilder({ align: -1 }),
stepAfter: steppedBuilder({ align: 1 }),
bar: barBuilder(),
};
}
if (style === DrawStyle.Bar) {
return builders.bar;
}
if (style === DrawStyle.Line) {
if (lineInterpolation === LineInterpolation.StepBefore) {
return builders.stepBefore;

View File

@@ -110,6 +110,7 @@ export enum LineStyle {
export enum DrawStyle {
Line = 'line',
Points = 'points',
Bar = 'bar',
}
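
The new DrawStyle.Bar value resolves to uPlot's bars path builder in getPathBuilder above. A hedged standalone sketch of that wiring against plain uPlot; sizes and colors are illustrative, and this is not the repo's UPlotSeriesBuilder:

import uPlot from 'uplot';

// uPlot exposes bars/spline/stepped/linear builders on uPlot.paths; the series
// builder above throws if any of them is missing, so guard the same way here.
const bars = uPlot.paths.bars;
if (!bars) {
	throw new Error('Required uPlot path builders are not available');
}

const opts: uPlot.Options = {
	width: 600,
	height: 300,
	series: [
		{},
		{
			label: 'requests',
			stroke: '#4e74f8',
			fill: '#4e74f8', // mirrors lineConfig.fill = lineColor for bar panels
			paths: bars({ size: [0.6, 100] }), // 60% of the slot, capped at 100px per bar
		},
	],
};

// In a browser context this renders a simple bar series:
// new uPlot(opts, [[0, 1, 2, 3], [5, 9, 4, 7]], document.body);
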
export enum LineInterpolation {
@@ -128,7 +129,7 @@ export enum VisibilityMode {
export interface SeriesProps {
scaleKey: string;
label?: string;
panelType: PANEL_TYPES;
colorMapping: Record<string, string>;
drawStyle: DrawStyle;
pathBuilder?: Series.PathBuilder;

View File

@@ -412,14 +412,16 @@ describe('Dashboard Provider - URL Variables Integration', () => {
});
// Verify dashboard state contains the variables with default values
const dashboardVariables = await screen.findByTestId('dashboard-variables');
const parsedVariables = JSON.parse(dashboardVariables.textContent || '{}');
await waitFor(() => {
const dashboardVariables = screen.getByTestId('dashboard-variables');
const parsedVariables = JSON.parse(dashboardVariables.textContent || '{}');
expect(parsedVariables).toHaveProperty('environment');
expect(parsedVariables).toHaveProperty('services');
// Default allSelected values should be preserved
expect(parsedVariables.environment.allSelected).toBe(false);
expect(parsedVariables.services.allSelected).toBe(false);
expect(parsedVariables).toHaveProperty('environment');
expect(parsedVariables).toHaveProperty('services');
// Default allSelected values should be preserved
expect(parsedVariables.environment.allSelected).toBe(false);
expect(parsedVariables.services.allSelected).toBe(false);
});
});
it('should merge URL variables with dashboard data and normalize values correctly', async () => {
@@ -466,16 +468,26 @@ describe('Dashboard Provider - URL Variables Integration', () => {
});
// Verify the dashboard state reflects the normalized URL values
const dashboardVariables = await screen.findByTestId('dashboard-variables');
const parsedVariables = JSON.parse(dashboardVariables.textContent || '{}');
await waitFor(() => {
const dashboardVariables = screen.getByTestId('dashboard-variables');
const parsedVariables = JSON.parse(dashboardVariables.textContent || '{}');
// The selectedValue should be updated with normalized URL values
expect(parsedVariables.environment.selectedValue).toBe('development');
expect(parsedVariables.services.selectedValue).toEqual(['db', 'cache']);
// First ensure the variables exist
expect(parsedVariables).toHaveProperty('environment');
expect(parsedVariables).toHaveProperty('services');
// allSelected should be set to false when URL values override
expect(parsedVariables.environment.allSelected).toBe(false);
expect(parsedVariables.services.allSelected).toBe(false);
// Then check their properties
expect(parsedVariables.environment).toHaveProperty('selectedValue');
expect(parsedVariables.services).toHaveProperty('selectedValue');
// The selectedValue should be updated with normalized URL values
expect(parsedVariables.environment.selectedValue).toBe('development');
expect(parsedVariables.services.selectedValue).toEqual(['db', 'cache']);
// allSelected should be set to false when URL values override
expect(parsedVariables.environment.allSelected).toBe(false);
expect(parsedVariables.services.allSelected).toBe(false);
});
});
it('should handle ALL_SELECTED_VALUE from URL and set allSelected correctly', async () => {
@@ -500,8 +512,8 @@ describe('Dashboard Provider - URL Variables Integration', () => {
);
// Verify that allSelected is set to true for the services variable
await waitFor(async () => {
const dashboardVariables = await screen.findByTestId('dashboard-variables');
await waitFor(() => {
const dashboardVariables = screen.getByTestId('dashboard-variables');
const parsedVariables = JSON.parse(dashboardVariables.textContent || '{}');
expect(parsedVariables.services.allSelected).toBe(true);
@@ -603,8 +615,8 @@ describe('Dashboard Provider - Textbox Variable Backward Compatibility', () => {
});
// Verify that defaultValue is set from textboxValue
await waitFor(async () => {
const dashboardVariables = await screen.findByTestId('dashboard-variables');
await waitFor(() => {
const dashboardVariables = screen.getByTestId('dashboard-variables');
const parsedVariables = JSON.parse(dashboardVariables.textContent || '{}');
expect(parsedVariables.myTextbox.type).toBe('TEXTBOX');
@@ -648,8 +660,8 @@ describe('Dashboard Provider - Textbox Variable Backward Compatibility', () => {
});
// Verify that existing defaultValue is preserved
await waitFor(async () => {
const dashboardVariables = await screen.findByTestId('dashboard-variables');
await waitFor(() => {
const dashboardVariables = screen.getByTestId('dashboard-variables');
const parsedVariables = JSON.parse(dashboardVariables.textContent || '{}');
expect(parsedVariables.myTextbox.type).toBe('TEXTBOX');
@@ -694,8 +706,8 @@ describe('Dashboard Provider - Textbox Variable Backward Compatibility', () => {
});
// Verify that defaultValue is set to empty string
await waitFor(async () => {
const dashboardVariables = await screen.findByTestId('dashboard-variables');
await waitFor(() => {
const dashboardVariables = screen.getByTestId('dashboard-variables');
const parsedVariables = JSON.parse(dashboardVariables.textContent || '{}');
expect(parsedVariables.myTextbox.type).toBe('TEXTBOX');
@@ -739,8 +751,8 @@ describe('Dashboard Provider - Textbox Variable Backward Compatibility', () => {
});
// Verify that defaultValue is NOT set from textboxValue for QUERY type
await waitFor(async () => {
const dashboardVariables = await screen.findByTestId('dashboard-variables');
await waitFor(() => {
const dashboardVariables = screen.getByTestId('dashboard-variables');
const parsedVariables = JSON.parse(dashboardVariables.textContent || '{}');
expect(parsedVariables.myQuery.type).toBe('QUERY');

View File

@@ -1,7 +1,7 @@
import { ALL_SELECTED_VALUE } from 'components/NewSelect/utils';
import { IDashboardVariable } from 'types/api/dashboard/getAll';
import { commaValuesParser } from '../../lib/dashbaordVariables/customCommaValuesParser';
import { commaValuesParser } from '../../lib/dashboardVariables/customCommaValuesParser';
interface UrlVariables {
[key: string]: any;

View File

@@ -18,7 +18,6 @@ import (
"github.com/SigNoz/signoz/pkg/modules/organization"
"github.com/SigNoz/signoz/pkg/modules/preference"
"github.com/SigNoz/signoz/pkg/modules/promote"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/modules/session"
"github.com/SigNoz/signoz/pkg/modules/user"
"github.com/SigNoz/signoz/pkg/types"
@@ -43,9 +42,8 @@ type provider struct {
dashboardHandler dashboard.Handler
metricsExplorerHandler metricsexplorer.Handler
gatewayHandler gateway.Handler
roleGetter role.Getter
roleHandler role.Handler
fieldsHandler fields.Handler
authzHandler authz.Handler
}
func NewFactory(
@@ -63,9 +61,8 @@ func NewFactory(
dashboardHandler dashboard.Handler,
metricsExplorerHandler metricsexplorer.Handler,
gatewayHandler gateway.Handler,
roleGetter role.Getter,
roleHandler role.Handler,
fieldsHandler fields.Handler,
authzHandler authz.Handler,
) factory.ProviderFactory[apiserver.APIServer, apiserver.Config] {
return factory.NewProviderFactory(factory.MustNewName("signoz"), func(ctx context.Context, providerSettings factory.ProviderSettings, config apiserver.Config) (apiserver.APIServer, error) {
return newProvider(
@@ -86,9 +83,8 @@ func NewFactory(
dashboardHandler,
metricsExplorerHandler,
gatewayHandler,
roleGetter,
roleHandler,
fieldsHandler,
authzHandler,
)
})
}
@@ -111,9 +107,8 @@ func newProvider(
dashboardHandler dashboard.Handler,
metricsExplorerHandler metricsexplorer.Handler,
gatewayHandler gateway.Handler,
roleGetter role.Getter,
roleHandler role.Handler,
fieldsHandler fields.Handler,
authzHandler authz.Handler,
) (apiserver.APIServer, error) {
settings := factory.NewScopedProviderSettings(providerSettings, "github.com/SigNoz/signoz/pkg/apiserver/signozapiserver")
router := mux.NewRouter().UseEncodedPath()
@@ -134,12 +129,11 @@ func newProvider(
dashboardHandler: dashboardHandler,
metricsExplorerHandler: metricsExplorerHandler,
gatewayHandler: gatewayHandler,
roleGetter: roleGetter,
roleHandler: roleHandler,
fieldsHandler: fieldsHandler,
authzHandler: authzHandler,
}
provider.authZ = middleware.NewAuthZ(settings.Logger(), orgGetter, authz, roleGetter)
provider.authZ = middleware.NewAuthZ(settings.Logger(), orgGetter, authz)
if err := provider.AddToRouter(router); err != nil {
return nil, err

View File

@@ -10,7 +10,7 @@ import (
)
func (provider *provider) addRoleRoutes(router *mux.Router) error {
if err := router.Handle("/api/v1/roles", handler.New(provider.authZ.AdminAccess(provider.roleHandler.Create), handler.OpenAPIDef{
if err := router.Handle("/api/v1/roles", handler.New(provider.authZ.AdminAccess(provider.authzHandler.Create), handler.OpenAPIDef{
ID: "CreateRole",
Tags: []string{"role"},
Summary: "Create role",
@@ -27,7 +27,7 @@ func (provider *provider) addRoleRoutes(router *mux.Router) error {
return err
}
if err := router.Handle("/api/v1/roles", handler.New(provider.authZ.AdminAccess(provider.roleHandler.List), handler.OpenAPIDef{
if err := router.Handle("/api/v1/roles", handler.New(provider.authZ.AdminAccess(provider.authzHandler.List), handler.OpenAPIDef{
ID: "ListRoles",
Tags: []string{"role"},
Summary: "List roles",
@@ -44,7 +44,7 @@ func (provider *provider) addRoleRoutes(router *mux.Router) error {
return err
}
if err := router.Handle("/api/v1/roles/{id}", handler.New(provider.authZ.AdminAccess(provider.roleHandler.Get), handler.OpenAPIDef{
if err := router.Handle("/api/v1/roles/{id}", handler.New(provider.authZ.AdminAccess(provider.authzHandler.Get), handler.OpenAPIDef{
ID: "GetRole",
Tags: []string{"role"},
Summary: "Get role",
@@ -61,7 +61,7 @@ func (provider *provider) addRoleRoutes(router *mux.Router) error {
return err
}
if err := router.Handle("/api/v1/roles/{id}", handler.New(provider.authZ.AdminAccess(provider.roleHandler.Patch), handler.OpenAPIDef{
if err := router.Handle("/api/v1/roles/{id}", handler.New(provider.authZ.AdminAccess(provider.authzHandler.Patch), handler.OpenAPIDef{
ID: "PatchRole",
Tags: []string{"role"},
Summary: "Patch role",
@@ -78,7 +78,7 @@ func (provider *provider) addRoleRoutes(router *mux.Router) error {
return err
}
if err := router.Handle("/api/v1/roles/{id}", handler.New(provider.authZ.AdminAccess(provider.roleHandler.Delete), handler.OpenAPIDef{
if err := router.Handle("/api/v1/roles/{id}", handler.New(provider.authZ.AdminAccess(provider.authzHandler.Delete), handler.OpenAPIDef{
ID: "DeleteRole",
Tags: []string{"role"},
Summary: "Delete role",

View File

@@ -2,9 +2,11 @@ package authz
import (
"context"
"net/http"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/roletypes"
"github.com/SigNoz/signoz/pkg/valuer"
openfgav1 "github.com/openfga/api/proto/openfga/v1"
)
@@ -29,4 +31,76 @@ type AuthZ interface {
// Lists the selectors for objects assigned to subject (s) with relation (r) on resource (s)
ListObjects(context.Context, string, authtypes.Relation, authtypes.Typeable) ([]*authtypes.Object, error)
// Creates the role.
Create(context.Context, valuer.UUID, *roletypes.Role) error
// Gets the role if it exists or creates one.
GetOrCreate(context.Context, valuer.UUID, *roletypes.Role) (*roletypes.Role, error)
// Gets the objects associated with the given role and relation.
GetObjects(context.Context, valuer.UUID, valuer.UUID, authtypes.Relation) ([]*authtypes.Object, error)
// Gets all the typeable resources registered from role registry.
GetResources(context.Context) []*authtypes.Resource
// Patches the role.
Patch(context.Context, valuer.UUID, *roletypes.Role) error
// Patches the objects in authorization server associated with the given role and relation
PatchObjects(context.Context, valuer.UUID, string, authtypes.Relation, []*authtypes.Object, []*authtypes.Object) error
// Deletes the role and tuples in authorization server.
Delete(context.Context, valuer.UUID, valuer.UUID) error
// Gets the role
Get(context.Context, valuer.UUID, valuer.UUID) (*roletypes.Role, error)
// Gets the role by org_id and name
GetByOrgIDAndName(context.Context, valuer.UUID, string) (*roletypes.Role, error)
// Lists all the roles for the organization.
List(context.Context, valuer.UUID) ([]*roletypes.Role, error)
// Lists all the roles for the organization filtered by name
ListByOrgIDAndNames(context.Context, valuer.UUID, []string) ([]*roletypes.Role, error)
// Grants a role to the subject based on role name.
Grant(context.Context, valuer.UUID, string, string) error
// Revokes a granted role from the subject based on role name.
Revoke(context.Context, valuer.UUID, string, string) error
// Changes the granted role for the subject based on role name.
ModifyGrant(context.Context, valuer.UUID, string, string, string) error
// Bootstrap the managed roles.
CreateManagedRoles(context.Context, valuer.UUID, []*roletypes.Role) error
// Bootstrap managed roles transactions and user assignments
CreateManagedUserRoleTransactions(context.Context, valuer.UUID, valuer.UUID) error
}
type RegisterTypeable interface {
MustGetTypeables() []authtypes.Typeable
MustGetManagedRoleTransactions() map[string][]*authtypes.Transaction
}
type Handler interface {
Create(http.ResponseWriter, *http.Request)
Get(http.ResponseWriter, *http.Request)
GetObjects(http.ResponseWriter, *http.Request)
GetResources(http.ResponseWriter, *http.Request)
List(http.ResponseWriter, *http.Request)
Patch(http.ResponseWriter, *http.Request)
PatchObjects(http.ResponseWriter, *http.Request)
Delete(http.ResponseWriter, *http.Request)
}

View File

@@ -1,4 +1,4 @@
package implrole
package sqlauthzstore
import (
"context"
@@ -14,7 +14,7 @@ type store struct {
sqlstore sqlstore.SQLStore
}
func NewStore(sqlstore sqlstore.SQLStore) roletypes.Store {
func NewSqlAuthzStore(sqlstore sqlstore.SQLStore) roletypes.Store {
return &store{sqlstore: sqlstore}
}

View File

@@ -2,35 +2,24 @@ package openfgaauthz
import (
"context"
"strconv"
"sync"
authz "github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/authz/authzstore/sqlauthzstore"
"github.com/SigNoz/signoz/pkg/authz/openfgaserver"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/roletypes"
"github.com/SigNoz/signoz/pkg/valuer"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/sqlstore"
openfgav1 "github.com/openfga/api/proto/openfga/v1"
openfgapkgtransformer "github.com/openfga/language/pkg/go/transformer"
openfgapkgserver "github.com/openfga/openfga/pkg/server"
"google.golang.org/protobuf/encoding/protojson"
)
var (
openfgaDefaultStore = valuer.NewString("signoz")
)
type provider struct {
config authz.Config
settings factory.ScopedProviderSettings
openfgaSchema []openfgapkgtransformer.ModuleFile
openfgaServer *openfgapkgserver.Server
storeID string
modelID string
mtx sync.RWMutex
stopChan chan struct{}
server *openfgaserver.Server
store roletypes.Store
}
func NewProviderFactory(sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile) factory.ProviderFactory[authz.AuthZ, authz.Config] {
@@ -40,301 +29,194 @@ func NewProviderFactory(sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtr
}
func newOpenfgaProvider(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile) (authz.AuthZ, error) {
scopedProviderSettings := factory.NewScopedProviderSettings(settings, "github.com/SigNoz/signoz/pkg/authz/openfgaauthz")
store, err := NewSQLStore(sqlstore)
server, err := openfgaserver.NewOpenfgaServer(ctx, settings, config, sqlstore, openfgaSchema)
if err != nil {
scopedProviderSettings.Logger().DebugContext(ctx, "failed to initialize sqlstore for authz")
return nil, err
}
// setup the openfga server
opts := []openfgapkgserver.OpenFGAServiceV1Option{
openfgapkgserver.WithDatastore(store),
openfgapkgserver.WithLogger(NewLogger(scopedProviderSettings.Logger())),
openfgapkgserver.WithContextPropagationToDatastore(true),
}
openfgaServer, err := openfgapkgserver.NewServerWithOpts(opts...)
if err != nil {
scopedProviderSettings.Logger().DebugContext(ctx, "failed to create authz server")
return nil, err
}
return &provider{
config: config,
settings: scopedProviderSettings,
openfgaServer: openfgaServer,
openfgaSchema: openfgaSchema,
mtx: sync.RWMutex{},
stopChan: make(chan struct{}),
server: server,
store: sqlauthzstore.NewSqlAuthzStore(sqlstore),
}, nil
}
func (provider *provider) Start(ctx context.Context) error {
storeId, err := provider.getOrCreateStore(ctx, openfgaDefaultStore.StringValue())
if err != nil {
return err
}
modelID, err := provider.getOrCreateModel(ctx, storeId)
if err != nil {
return err
}
provider.mtx.Lock()
provider.modelID = modelID
provider.storeID = storeId
provider.mtx.Unlock()
<-provider.stopChan
return nil
return provider.server.Start(ctx)
}
func (provider *provider) Stop(ctx context.Context) error {
provider.openfgaServer.Close()
close(provider.stopChan)
return nil
return provider.server.Stop(ctx)
}
func (provider *provider) Check(ctx context.Context, tupleReq *openfgav1.TupleKey) error {
storeID, modelID := provider.getStoreIDandModelID()
checkResponse, err := provider.openfgaServer.Check(
ctx,
&openfgav1.CheckRequest{
StoreId: storeID,
AuthorizationModelId: modelID,
TupleKey: &openfgav1.CheckRequestTupleKey{
User: tupleReq.User,
Relation: tupleReq.Relation,
Object: tupleReq.Object,
},
})
if err != nil {
return errors.Newf(errors.TypeInternal, authtypes.ErrCodeAuthZUnavailable, "authorization server is unavailable").WithAdditional(err.Error())
}
if !checkResponse.Allowed {
return errors.Newf(errors.TypeForbidden, authtypes.ErrCodeAuthZForbidden, "subject %s cannot %s object %s", tupleReq.User, tupleReq.Relation, tupleReq.Object)
}
return nil
return provider.server.Check(ctx, tupleReq)
}
func (provider *provider) BatchCheck(ctx context.Context, tupleReq []*openfgav1.TupleKey) error {
storeID, modelID := provider.getStoreIDandModelID()
batchCheckItems := make([]*openfgav1.BatchCheckItem, 0)
for idx, tuple := range tupleReq {
batchCheckItems = append(batchCheckItems, &openfgav1.BatchCheckItem{
TupleKey: &openfgav1.CheckRequestTupleKey{
User: tuple.User,
Relation: tuple.Relation,
Object: tuple.Object,
},
// the batch check response is map[string] keyed by correlationID.
CorrelationId: strconv.Itoa(idx),
})
}
checkResponse, err := provider.openfgaServer.BatchCheck(
ctx,
&openfgav1.BatchCheckRequest{
StoreId: storeID,
AuthorizationModelId: modelID,
Checks: batchCheckItems,
})
if err != nil {
return errors.Newf(errors.TypeInternal, authtypes.ErrCodeAuthZUnavailable, "authorization server is unavailable").WithAdditional(err.Error())
}
for _, checkResponse := range checkResponse.Result {
if checkResponse.GetAllowed() {
return nil
}
}
return errors.Newf(errors.TypeForbidden, authtypes.ErrCodeAuthZForbidden, "subjects are not authorized for requested access")
return provider.server.BatchCheck(ctx, tupleReq)
}
func (provider *provider) CheckWithTupleCreation(ctx context.Context, claims authtypes.Claims, orgID valuer.UUID, _ authtypes.Relation, _ authtypes.Typeable, _ []authtypes.Selector, roleSelectors []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeableUser, claims.UserID, orgID, nil)
if err != nil {
return err
}
tuples, err := authtypes.TypeableRole.Tuples(subject, authtypes.RelationAssignee, roleSelectors, orgID)
if err != nil {
return err
}
err = provider.BatchCheck(ctx, tuples)
if err != nil {
return err
}
return nil
func (provider *provider) CheckWithTupleCreation(ctx context.Context, claims authtypes.Claims, orgID valuer.UUID, relation authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector, roleSelectors []authtypes.Selector) error {
return provider.server.CheckWithTupleCreation(ctx, claims, orgID, relation, typeable, selectors, roleSelectors)
}
func (provider *provider) CheckWithTupleCreationWithoutClaims(ctx context.Context, orgID valuer.UUID, _ authtypes.Relation, _ authtypes.Typeable, _ []authtypes.Selector, roleSelectors []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeableAnonymous, authtypes.AnonymousUser.String(), orgID, nil)
if err != nil {
return err
}
tuples, err := authtypes.TypeableRole.Tuples(subject, authtypes.RelationAssignee, roleSelectors, orgID)
if err != nil {
return err
}
err = provider.BatchCheck(ctx, tuples)
if err != nil {
return err
}
return nil
func (provider *provider) CheckWithTupleCreationWithoutClaims(ctx context.Context, orgID valuer.UUID, relation authtypes.Relation, typeable authtypes.Typeable, selectors []authtypes.Selector, roleSelectors []authtypes.Selector) error {
return provider.server.CheckWithTupleCreationWithoutClaims(ctx, orgID, relation, typeable, selectors, roleSelectors)
}
func (provider *provider) Write(ctx context.Context, additions []*openfgav1.TupleKey, deletions []*openfgav1.TupleKey) error {
if len(additions) == 0 && len(deletions) == 0 {
return nil
}
storeID, modelID := provider.getStoreIDandModelID()
deletionTuplesWithoutCondition := make([]*openfgav1.TupleKeyWithoutCondition, len(deletions))
for idx, tuple := range deletions {
deletionTuplesWithoutCondition[idx] = &openfgav1.TupleKeyWithoutCondition{User: tuple.User, Object: tuple.Object, Relation: tuple.Relation}
}
_, err := provider.openfgaServer.Write(ctx, &openfgav1.WriteRequest{
StoreId: storeID,
AuthorizationModelId: modelID,
Writes: func() *openfgav1.WriteRequestWrites {
if len(additions) == 0 {
return nil
}
return &openfgav1.WriteRequestWrites{
TupleKeys: additions,
OnDuplicate: "ignore",
}
}(),
Deletes: func() *openfgav1.WriteRequestDeletes {
if len(deletionTuplesWithoutCondition) == 0 {
return nil
}
return &openfgav1.WriteRequestDeletes{
TupleKeys: deletionTuplesWithoutCondition,
OnMissing: "ignore",
}
}(),
})
return err
return provider.server.Write(ctx, additions, deletions)
}
func (provider *provider) ListObjects(ctx context.Context, subject string, relation authtypes.Relation, typeable authtypes.Typeable) ([]*authtypes.Object, error) {
storeID, modelID := provider.getStoreIDandModelID()
response, err := provider.openfgaServer.ListObjects(ctx, &openfgav1.ListObjectsRequest{
StoreId: storeID,
AuthorizationModelId: modelID,
User: subject,
Relation: relation.StringValue(),
Type: typeable.Type().StringValue(),
return provider.server.ListObjects(ctx, subject, relation, typeable)
}
func (provider *provider) Get(ctx context.Context, orgID valuer.UUID, id valuer.UUID) (*roletypes.Role, error) {
storableRole, err := provider.store.Get(ctx, orgID, id)
if err != nil {
return nil, err
}
return roletypes.NewRoleFromStorableRole(storableRole), nil
}
func (provider *provider) GetByOrgIDAndName(ctx context.Context, orgID valuer.UUID, name string) (*roletypes.Role, error) {
storableRole, err := provider.store.GetByOrgIDAndName(ctx, orgID, name)
if err != nil {
return nil, err
}
return roletypes.NewRoleFromStorableRole(storableRole), nil
}
func (provider *provider) List(ctx context.Context, orgID valuer.UUID) ([]*roletypes.Role, error) {
storableRoles, err := provider.store.List(ctx, orgID)
if err != nil {
return nil, err
}
roles := make([]*roletypes.Role, len(storableRoles))
for idx, storableRole := range storableRoles {
roles[idx] = roletypes.NewRoleFromStorableRole(storableRole)
}
return roles, nil
}
func (provider *provider) ListByOrgIDAndNames(ctx context.Context, orgID valuer.UUID, names []string) ([]*roletypes.Role, error) {
storableRoles, err := provider.store.ListByOrgIDAndNames(ctx, orgID, names)
if err != nil {
return nil, err
}
roles := make([]*roletypes.Role, len(storableRoles))
for idx, storable := range storableRoles {
roles[idx] = roletypes.NewRoleFromStorableRole(storable)
}
return roles, nil
}
func (provider *provider) Grant(ctx context.Context, orgID valuer.UUID, name string, subject string) error {
tuples, err := authtypes.TypeableRole.Tuples(
subject,
authtypes.RelationAssignee,
[]authtypes.Selector{
authtypes.MustNewSelector(authtypes.TypeRole, name),
},
orgID,
)
if err != nil {
return err
}
return provider.Write(ctx, tuples, nil)
}
func (provider *provider) ModifyGrant(ctx context.Context, orgID valuer.UUID, existingRoleName string, updatedRoleName string, subject string) error {
err := provider.Revoke(ctx, orgID, existingRoleName, subject)
if err != nil {
return err
}
err = provider.Grant(ctx, orgID, updatedRoleName, subject)
if err != nil {
return err
}
return nil
}
func (provider *provider) Revoke(ctx context.Context, orgID valuer.UUID, name string, subject string) error {
tuples, err := authtypes.TypeableRole.Tuples(
subject,
authtypes.RelationAssignee,
[]authtypes.Selector{
authtypes.MustNewSelector(authtypes.TypeRole, name),
},
orgID,
)
if err != nil {
return err
}
return provider.Write(ctx, nil, tuples)
}
func (provider *provider) CreateManagedRoles(ctx context.Context, _ valuer.UUID, managedRoles []*roletypes.Role) error {
err := provider.store.RunInTx(ctx, func(ctx context.Context) error {
for _, role := range managedRoles {
err := provider.store.Create(ctx, roletypes.NewStorableRoleFromRole(role))
if err != nil {
return err
}
}
return nil
})
if err != nil {
return err
}
return nil
}
func (provider *provider) SetManagedRoleTransactions(context.Context, valuer.UUID) error {
return nil
}
func (provider *provider) CreateManagedUserRoleTransactions(ctx context.Context, orgID valuer.UUID, userID valuer.UUID) error {
return provider.Grant(ctx, orgID, roletypes.SigNozAdminRoleName, authtypes.MustNewSubject(authtypes.TypeableUser, userID.String(), orgID, nil))
}
func (provider *provider) Create(_ context.Context, _ valuer.UUID, _ *roletypes.Role) error {
return errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (provider *provider) GetOrCreate(_ context.Context, _ valuer.UUID, _ *roletypes.Role) (*roletypes.Role, error) {
return nil, errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (provider *provider) GetResources(_ context.Context) []*authtypes.Resource {
return nil
}
func (provider *provider) GetObjects(ctx context.Context, orgID valuer.UUID, id valuer.UUID, relation authtypes.Relation) ([]*authtypes.Object, error) {
return nil, errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (provider *provider) Patch(_ context.Context, _ valuer.UUID, _ *roletypes.Role) error {
return errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (provider *provider) PatchObjects(_ context.Context, _ valuer.UUID, _ string, _ authtypes.Relation, _, _ []*authtypes.Object) error {
return errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (provider *provider) Delete(_ context.Context, _ valuer.UUID, _ valuer.UUID) error {
return errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (provider *provider) MustGetTypeables() []authtypes.Typeable {
return nil
}
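A minimal sketch of how the refactored provider above composes its two dependencies: role reads and managed-role creation go through the roletypes.Store, while tuple writes, checks, and object listings are delegated to the extracted openfga server. The package name, import path, and constructor are assumptions for illustration; only the store and server field names are taken from the calls in the diff.

package signozauthz // package name assumed for illustration

import (
    "github.com/SigNoz/signoz/pkg/authz/openfgaserver" // assumed import path for the extracted Server
    "github.com/SigNoz/signoz/pkg/types/roletypes"
)

// provider keeps role persistence in a roletypes.Store and forwards every
// tuple write, permission check, and object listing to the extracted Server.
type provider struct {
    store  roletypes.Store
    server *openfgaserver.Server
}

// NewProvider is a hypothetical constructor, shown only to make the wiring explicit.
func NewProvider(store roletypes.Store, server *openfgaserver.Server) *provider {
    return &provider{store: store, server: server}
}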

View File

@@ -1,4 +1,4 @@
package openfgaauthz
package openfgaserver
import (
"context"

View File

@@ -0,0 +1,334 @@
package openfgaserver
import (
"context"
"strconv"
"sync"
authz "github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/valuer"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/sqlstore"
openfgav1 "github.com/openfga/api/proto/openfga/v1"
openfgapkgtransformer "github.com/openfga/language/pkg/go/transformer"
openfgapkgserver "github.com/openfga/openfga/pkg/server"
"google.golang.org/protobuf/encoding/protojson"
)
var (
openfgaDefaultStore = valuer.NewString("signoz")
)
type Server struct {
config authz.Config
settings factory.ScopedProviderSettings
openfgaSchema []openfgapkgtransformer.ModuleFile
openfgaServer *openfgapkgserver.Server
storeID string
modelID string
mtx sync.RWMutex
stopChan chan struct{}
}
func NewOpenfgaServer(ctx context.Context, settings factory.ProviderSettings, config authz.Config, sqlstore sqlstore.SQLStore, openfgaSchema []openfgapkgtransformer.ModuleFile) (*Server, error) {
scopedProviderSettings := factory.NewScopedProviderSettings(settings, "github.com/SigNoz/signoz/pkg/authz/openfgaauthz")
store, err := NewSQLStore(sqlstore)
if err != nil {
scopedProviderSettings.Logger().DebugContext(ctx, "failed to initialize sqlstore for authz")
return nil, err
}
// setup the openfga server
opts := []openfgapkgserver.OpenFGAServiceV1Option{
openfgapkgserver.WithDatastore(store),
openfgapkgserver.WithLogger(NewLogger(scopedProviderSettings.Logger())),
openfgapkgserver.WithContextPropagationToDatastore(true),
}
openfgaServer, err := openfgapkgserver.NewServerWithOpts(opts...)
if err != nil {
scopedProviderSettings.Logger().DebugContext(ctx, "failed to create authz server")
return nil, err
}
return &Server{
config: config,
settings: scopedProviderSettings,
openfgaServer: openfgaServer,
openfgaSchema: openfgaSchema,
mtx: sync.RWMutex{},
stopChan: make(chan struct{}),
}, nil
}
func (server *Server) Start(ctx context.Context) error {
storeID, err := server.getOrCreateStore(ctx, openfgaDefaultStore.StringValue())
if err != nil {
return err
}
modelID, err := server.getOrCreateModel(ctx, storeID)
if err != nil {
return err
}
server.mtx.Lock()
server.modelID = modelID
server.storeID = storeID
server.mtx.Unlock()
<-server.stopChan
return nil
}
func (server *Server) Stop(ctx context.Context) error {
server.openfgaServer.Close()
close(server.stopChan)
return nil
}
func (server *Server) Check(ctx context.Context, tupleReq *openfgav1.TupleKey) error {
storeID, modelID := server.getStoreIDandModelID()
checkResponse, err := server.openfgaServer.Check(
ctx,
&openfgav1.CheckRequest{
StoreId: storeID,
AuthorizationModelId: modelID,
TupleKey: &openfgav1.CheckRequestTupleKey{
User: tupleReq.User,
Relation: tupleReq.Relation,
Object: tupleReq.Object,
},
})
if err != nil {
return errors.Newf(errors.TypeInternal, authtypes.ErrCodeAuthZUnavailable, "authorization server is unavailable").WithAdditional(err.Error())
}
if !checkResponse.Allowed {
return errors.Newf(errors.TypeForbidden, authtypes.ErrCodeAuthZForbidden, "subject %s cannot %s object %s", tupleReq.User, tupleReq.Relation, tupleReq.Object)
}
return nil
}
func (server *Server) BatchCheck(ctx context.Context, tupleReq []*openfgav1.TupleKey) error {
storeID, modelID := server.getStoreIDandModelID()
batchCheckItems := make([]*openfgav1.BatchCheckItem, 0)
for idx, tuple := range tupleReq {
batchCheckItems = append(batchCheckItems, &openfgav1.BatchCheckItem{
TupleKey: &openfgav1.CheckRequestTupleKey{
User: tuple.User,
Relation: tuple.Relation,
Object: tuple.Object,
},
// the batch check response is map[string] keyed by correlationID.
CorrelationId: strconv.Itoa(idx),
})
}
checkResponse, err := server.openfgaServer.BatchCheck(
ctx,
&openfgav1.BatchCheckRequest{
StoreId: storeID,
AuthorizationModelId: modelID,
Checks: batchCheckItems,
})
if err != nil {
return errors.Newf(errors.TypeInternal, authtypes.ErrCodeAuthZUnavailable, "authorization server is unavailable").WithAdditional(err.Error())
}
for _, checkResponse := range checkResponse.Result {
if checkResponse.GetAllowed() {
return nil
}
}
return errors.Newf(errors.TypeForbidden, authtypes.ErrCodeAuthZForbidden, "subjects are not authorized for requested access")
}
func (server *Server) CheckWithTupleCreation(ctx context.Context, claims authtypes.Claims, orgID valuer.UUID, _ authtypes.Relation, _ authtypes.Typeable, _ []authtypes.Selector, roleSelectors []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeableUser, claims.UserID, orgID, nil)
if err != nil {
return err
}
tuples, err := authtypes.TypeableRole.Tuples(subject, authtypes.RelationAssignee, roleSelectors, orgID)
if err != nil {
return err
}
err = server.BatchCheck(ctx, tuples)
if err != nil {
return err
}
return nil
}
func (server *Server) CheckWithTupleCreationWithoutClaims(ctx context.Context, orgID valuer.UUID, _ authtypes.Relation, _ authtypes.Typeable, _ []authtypes.Selector, roleSelectors []authtypes.Selector) error {
subject, err := authtypes.NewSubject(authtypes.TypeableAnonymous, authtypes.AnonymousUser.String(), orgID, nil)
if err != nil {
return err
}
tuples, err := authtypes.TypeableRole.Tuples(subject, authtypes.RelationAssignee, roleSelectors, orgID)
if err != nil {
return err
}
err = server.BatchCheck(ctx, tuples)
if err != nil {
return err
}
return nil
}
func (server *Server) Write(ctx context.Context, additions []*openfgav1.TupleKey, deletions []*openfgav1.TupleKey) error {
if len(additions) == 0 && len(deletions) == 0 {
return nil
}
storeID, modelID := server.getStoreIDandModelID()
deletionTuplesWithoutCondition := make([]*openfgav1.TupleKeyWithoutCondition, len(deletions))
for idx, tuple := range deletions {
deletionTuplesWithoutCondition[idx] = &openfgav1.TupleKeyWithoutCondition{User: tuple.User, Object: tuple.Object, Relation: tuple.Relation}
}
_, err := server.openfgaServer.Write(ctx, &openfgav1.WriteRequest{
StoreId: storeID,
AuthorizationModelId: modelID,
Writes: func() *openfgav1.WriteRequestWrites {
if len(additions) == 0 {
return nil
}
return &openfgav1.WriteRequestWrites{
TupleKeys: additions,
OnDuplicate: "ignore",
}
}(),
Deletes: func() *openfgav1.WriteRequestDeletes {
if len(deletionTuplesWithoutCondition) == 0 {
return nil
}
return &openfgav1.WriteRequestDeletes{
TupleKeys: deletionTuplesWithoutCondition,
OnMissing: "ignore",
}
}(),
})
return err
}
func (server *Server) ListObjects(ctx context.Context, subject string, relation authtypes.Relation, typeable authtypes.Typeable) ([]*authtypes.Object, error) {
storeID, modelID := server.getStoreIDandModelID()
response, err := server.openfgaServer.ListObjects(ctx, &openfgav1.ListObjectsRequest{
StoreId: storeID,
AuthorizationModelId: modelID,
User: subject,
Relation: relation.StringValue(),
Type: typeable.Type().StringValue(),
})
if err != nil {
return nil, errors.Wrapf(err, errors.TypeInternal, authtypes.ErrCodeAuthZUnavailable, "cannot list objects for subject %s with relation %s for type %s", subject, relation.StringValue(), typeable.Type().StringValue())
}
return authtypes.MustNewObjectsFromStringSlice(response.Objects), nil
}
func (server *Server) getOrCreateStore(ctx context.Context, name string) (string, error) {
stores, err := server.openfgaServer.ListStores(ctx, &openfgav1.ListStoresRequest{})
if err != nil {
return "", err
}
for _, store := range stores.GetStores() {
if store.GetName() == name {
return store.Id, nil
}
}
store, err := server.openfgaServer.CreateStore(ctx, &openfgav1.CreateStoreRequest{Name: name})
if err != nil {
return "", err
}
return store.Id, nil
}
func (server *Server) getOrCreateModel(ctx context.Context, storeID string) (string, error) {
schema, err := openfgapkgtransformer.TransformModuleFilesToModel(server.openfgaSchema, "1.1")
if err != nil {
return "", err
}
authorisationModels, err := server.openfgaServer.ReadAuthorizationModels(ctx, &openfgav1.ReadAuthorizationModelsRequest{StoreId: storeID})
if err != nil {
return "", err
}
for _, authModel := range authorisationModels.GetAuthorizationModels() {
equal, err := server.isModelEqual(schema, authModel)
if err != nil {
return "", err
}
if equal {
return authModel.Id, nil
}
}
authorizationModel, err := server.openfgaServer.WriteAuthorizationModel(ctx, &openfgav1.WriteAuthorizationModelRequest{
StoreId: storeID,
TypeDefinitions: schema.TypeDefinitions,
SchemaVersion: schema.SchemaVersion,
Conditions: schema.Conditions,
})
if err != nil {
return "", err
}
return authorizationModel.AuthorizationModelId, nil
}
// the language model doesn't have any equality check
// https://github.com/openfga/language/blob/main/pkg/go/transformer/module-to-model_test.go#L38
func (server *Server) isModelEqual(expected *openfgav1.AuthorizationModel, actual *openfgav1.AuthorizationModel) (bool, error) {
// we need to initialize a new model since the model extracted from schema doesn't have id
expectedAuthModel := openfgav1.AuthorizationModel{
SchemaVersion: expected.SchemaVersion,
TypeDefinitions: expected.TypeDefinitions,
Conditions: expected.Conditions,
}
expectedAuthModelBytes, err := protojson.Marshal(&expectedAuthModel)
if err != nil {
return false, err
}
actualAuthModel := openfgav1.AuthorizationModel{
SchemaVersion: actual.SchemaVersion,
TypeDefinitions: actual.TypeDefinitions,
Conditions: actual.Conditions,
}
actualAuthModelBytes, err := protojson.Marshal(&actualAuthModel)
if err != nil {
return false, err
}
return string(expectedAuthModelBytes) == string(actualAuthModelBytes), nil
}
func (server *Server) getStoreIDandModelID() (string, string) {
server.mtx.RLock()
defer server.mtx.RUnlock()
storeID := server.storeID
modelID := server.modelID
return storeID, modelID
}
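A hedged sketch of how the Server above is expected to be driven: Start resolves the store and model IDs and then blocks on stopChan until Stop is called, so it runs in its own goroutine. The package, import path, helper name, and tuple values are illustrative assumptions; only the method names and error semantics come from the code above.

package example // illustration only

import (
    "context"

    openfgav1 "github.com/openfga/api/proto/openfga/v1"

    "github.com/SigNoz/signoz/pkg/authz/openfgaserver" // assumed import path
)

// runAuthz is a hypothetical driver for the Server lifecycle.
func runAuthz(ctx context.Context, server *openfgaserver.Server) error {
    errCh := make(chan error, 1)
    // Start resolves storeID/modelID and then blocks until Stop closes stopChan.
    go func() { errCh <- server.Start(ctx) }()

    // A single authorization check; real wiring waits for Start to finish resolving
    // the IDs before serving checks, and the tuple values here are examples only.
    tuple := &openfgav1.TupleKey{
        User:     "user:alice",
        Relation: "assignee",
        Object:   "role:admin",
    }
    if err := server.Check(ctx, tuple); err != nil {
        // TypeForbidden when the check is denied, TypeInternal when openfga is unreachable.
        return err
    }

    _ = server.Stop(ctx) // closes the embedded openfga server and unblocks Start
    return <-errCh
}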

View File

@@ -1,4 +1,4 @@
package openfgaauthz
package openfgaserver
import (
"context"
@@ -20,7 +20,7 @@ func TestProviderStartStop(t *testing.T) {
expectedModel := `module base
type user`
provider, err := newOpenfgaProvider(context.Background(), providerSettings, authz.Config{}, sqlstore, []transformer.ModuleFile{{Name: "test.fga", Contents: expectedModel}})
provider, err := NewOpenfgaServer(context.Background(), providerSettings, authz.Config{}, sqlstore, []transformer.ModuleFile{{Name: "test.fga", Contents: expectedModel}})
require.NoError(t, err)
storeRows := sqlstore.Mock().NewRows([]string{"id", "name", "created_at", "updated_at"}).AddRow("01K3V0NTN47MPTMEV1PD5ST6ZC", "signoz", time.Now(), time.Now())

View File

@@ -1,4 +1,4 @@
package openfgaauthz
package openfgaserver
import (
"github.com/SigNoz/signoz/pkg/errors"

View File

@@ -1,12 +1,12 @@
package implrole
package signozauthzapi
import (
"net/http"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/http/binding"
"github.com/SigNoz/signoz/pkg/http/render"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/roletypes"
"github.com/SigNoz/signoz/pkg/valuer"
@@ -14,12 +14,11 @@ import (
)
type handler struct {
setter role.Setter
getter role.Getter
authz authz.AuthZ
}
func NewHandler(setter role.Setter, getter role.Getter) role.Handler {
return &handler{setter: setter, getter: getter}
func NewHandler(authz authz.AuthZ) authz.Handler {
return &handler{authz: authz}
}
func (handler *handler) Create(rw http.ResponseWriter, r *http.Request) {
@@ -36,7 +35,7 @@ func (handler *handler) Create(rw http.ResponseWriter, r *http.Request) {
return
}
err = handler.setter.Create(ctx, valuer.MustNewUUID(claims.OrgID), roletypes.NewRole(req.Name, req.Description, roletypes.RoleTypeCustom, valuer.MustNewUUID(claims.OrgID)))
err = handler.authz.Create(ctx, valuer.MustNewUUID(claims.OrgID), roletypes.NewRole(req.Name, req.Description, roletypes.RoleTypeCustom, valuer.MustNewUUID(claims.OrgID)))
if err != nil {
render.Error(rw, err)
return
@@ -64,7 +63,7 @@ func (handler *handler) Get(rw http.ResponseWriter, r *http.Request) {
return
}
role, err := handler.getter.Get(ctx, valuer.MustNewUUID(claims.OrgID), roleID)
role, err := handler.authz.Get(ctx, valuer.MustNewUUID(claims.OrgID), roleID)
if err != nil {
render.Error(rw, err)
return
@@ -103,7 +102,7 @@ func (handler *handler) GetObjects(rw http.ResponseWriter, r *http.Request) {
return
}
objects, err := handler.setter.GetObjects(ctx, valuer.MustNewUUID(claims.OrgID), roleID, relation)
objects, err := handler.authz.GetObjects(ctx, valuer.MustNewUUID(claims.OrgID), roleID, relation)
if err != nil {
render.Error(rw, err)
return
@@ -114,7 +113,7 @@ func (handler *handler) GetObjects(rw http.ResponseWriter, r *http.Request) {
func (handler *handler) GetResources(rw http.ResponseWriter, r *http.Request) {
ctx := r.Context()
resources := handler.setter.GetResources(ctx)
resources := handler.authz.GetResources(ctx)
var resourceRelations = struct {
Resources []*authtypes.Resource `json:"resources"`
@@ -134,7 +133,7 @@ func (handler *handler) List(rw http.ResponseWriter, r *http.Request) {
return
}
roles, err := handler.getter.List(ctx, valuer.MustNewUUID(claims.OrgID))
roles, err := handler.authz.List(ctx, valuer.MustNewUUID(claims.OrgID))
if err != nil {
render.Error(rw, err)
return
@@ -163,7 +162,7 @@ func (handler *handler) Patch(rw http.ResponseWriter, r *http.Request) {
return
}
role, err := handler.getter.Get(ctx, valuer.MustNewUUID(claims.OrgID), id)
role, err := handler.authz.Get(ctx, valuer.MustNewUUID(claims.OrgID), id)
if err != nil {
render.Error(rw, err)
return
@@ -175,7 +174,7 @@ func (handler *handler) Patch(rw http.ResponseWriter, r *http.Request) {
return
}
err = handler.setter.Patch(ctx, valuer.MustNewUUID(claims.OrgID), role)
err = handler.authz.Patch(ctx, valuer.MustNewUUID(claims.OrgID), role)
if err != nil {
render.Error(rw, err)
return
@@ -210,7 +209,7 @@ func (handler *handler) PatchObjects(rw http.ResponseWriter, r *http.Request) {
return
}
role, err := handler.getter.Get(ctx, valuer.MustNewUUID(claims.OrgID), id)
role, err := handler.authz.Get(ctx, valuer.MustNewUUID(claims.OrgID), id)
if err != nil {
render.Error(rw, err)
return
@@ -222,7 +221,7 @@ func (handler *handler) PatchObjects(rw http.ResponseWriter, r *http.Request) {
return
}
err = handler.setter.PatchObjects(ctx, valuer.MustNewUUID(claims.OrgID), role.Name, relation, patchableObjects.Additions, patchableObjects.Deletions)
err = handler.authz.PatchObjects(ctx, valuer.MustNewUUID(claims.OrgID), role.Name, relation, patchableObjects.Additions, patchableObjects.Deletions)
if err != nil {
render.Error(rw, err)
return
@@ -245,7 +244,7 @@ func (handler *handler) Delete(rw http.ResponseWriter, r *http.Request) {
return
}
err = handler.setter.Delete(ctx, valuer.MustNewUUID(claims.OrgID), id)
err = handler.authz.Delete(ctx, valuer.MustNewUUID(claims.OrgID), id)
if err != nil {
render.Error(rw, err)
return

View File

@@ -8,7 +8,6 @@ import (
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/http/render"
"github.com/SigNoz/signoz/pkg/modules/organization"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/ctxtypes"
"github.com/SigNoz/signoz/pkg/types/roletypes"
@@ -24,15 +23,14 @@ type AuthZ struct {
logger *slog.Logger
orgGetter organization.Getter
authzService authz.AuthZ
roleGetter role.Getter
}
func NewAuthZ(logger *slog.Logger, orgGetter organization.Getter, authzService authz.AuthZ, roleGetter role.Getter) *AuthZ {
func NewAuthZ(logger *slog.Logger, orgGetter organization.Getter, authzService authz.AuthZ) *AuthZ {
if logger == nil {
panic("cannot build authz middleware, logger is empty")
}
return &AuthZ{logger: logger, orgGetter: orgGetter, authzService: authzService, roleGetter: roleGetter}
return &AuthZ{logger: logger, orgGetter: orgGetter, authzService: authzService}
}
func (middleware *AuthZ) ViewAccess(next http.HandlerFunc) http.HandlerFunc {

View File

@@ -4,7 +4,7 @@ import (
"context"
"net/http"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/statsreporter"
"github.com/SigNoz/signoz/pkg/types"
"github.com/SigNoz/signoz/pkg/types/authtypes"
@@ -51,7 +51,7 @@ type Module interface {
statsreporter.StatsCollector
role.RegisterTypeable
authz.RegisterTypeable
}
type Handler interface {

View File

@@ -206,6 +206,10 @@ func (module *module) MustGetTypeables() []authtypes.Typeable {
return []authtypes.Typeable{dashboardtypes.TypeableMetaResourceDashboard, dashboardtypes.TypeableMetaResourcesDashboards}
}
func (module *module) MustGetManagedRoleTransactions() map[string][]*authtypes.Transaction {
return nil
}
// not supported
func (module *module) CreatePublic(ctx context.Context, orgID valuer.UUID, publicDashboard *dashboardtypes.PublicDashboard) error {
return errors.Newf(errors.TypeUnsupported, dashboardtypes.ErrCodePublicDashboardUnsupported, "not implemented")

View File

@@ -1,63 +0,0 @@
package implrole
import (
"context"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/types/roletypes"
"github.com/SigNoz/signoz/pkg/valuer"
)
type getter struct {
store roletypes.Store
}
func NewGetter(store roletypes.Store) role.Getter {
return &getter{store: store}
}
func (getter *getter) Get(ctx context.Context, orgID valuer.UUID, id valuer.UUID) (*roletypes.Role, error) {
storableRole, err := getter.store.Get(ctx, orgID, id)
if err != nil {
return nil, err
}
return roletypes.NewRoleFromStorableRole(storableRole), nil
}
func (getter *getter) GetByOrgIDAndName(ctx context.Context, orgID valuer.UUID, name string) (*roletypes.Role, error) {
storableRole, err := getter.store.GetByOrgIDAndName(ctx, orgID, name)
if err != nil {
return nil, err
}
return roletypes.NewRoleFromStorableRole(storableRole), nil
}
func (getter *getter) List(ctx context.Context, orgID valuer.UUID) ([]*roletypes.Role, error) {
storableRoles, err := getter.store.List(ctx, orgID)
if err != nil {
return nil, err
}
roles := make([]*roletypes.Role, len(storableRoles))
for idx, storableRole := range storableRoles {
roles[idx] = roletypes.NewRoleFromStorableRole(storableRole)
}
return roles, nil
}
func (getter *getter) ListByOrgIDAndNames(ctx context.Context, orgID valuer.UUID, names []string) ([]*roletypes.Role, error) {
storableRoles, err := getter.store.ListByOrgIDAndNames(ctx, orgID, names)
if err != nil {
return nil, err
}
roles := make([]*roletypes.Role, len(storableRoles))
for idx, storable := range storableRoles {
roles[idx] = roletypes.NewRoleFromStorableRole(storable)
}
return roles, nil
}

View File

@@ -1,83 +0,0 @@
package implrole
import (
"context"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/roletypes"
"github.com/SigNoz/signoz/pkg/valuer"
)
type granter struct {
store roletypes.Store
authz authz.AuthZ
}
func NewGranter(store roletypes.Store, authz authz.AuthZ) role.Granter {
return &granter{store: store, authz: authz}
}
func (granter *granter) Grant(ctx context.Context, orgID valuer.UUID, name string, subject string) error {
tuples, err := authtypes.TypeableRole.Tuples(
subject,
authtypes.RelationAssignee,
[]authtypes.Selector{
authtypes.MustNewSelector(authtypes.TypeRole, name),
},
orgID,
)
if err != nil {
return err
}
return granter.authz.Write(ctx, tuples, nil)
}
func (granter *granter) ModifyGrant(ctx context.Context, orgID valuer.UUID, existingRoleName string, updatedRoleName string, subject string) error {
err := granter.Revoke(ctx, orgID, existingRoleName, subject)
if err != nil {
return err
}
err = granter.Grant(ctx, orgID, updatedRoleName, subject)
if err != nil {
return err
}
return nil
}
func (granter *granter) Revoke(ctx context.Context, orgID valuer.UUID, name string, subject string) error {
tuples, err := authtypes.TypeableRole.Tuples(
subject,
authtypes.RelationAssignee,
[]authtypes.Selector{
authtypes.MustNewSelector(authtypes.TypeRole, name),
},
orgID,
)
if err != nil {
return err
}
return granter.authz.Write(ctx, nil, tuples)
}
func (granter *granter) CreateManagedRoles(ctx context.Context, _ valuer.UUID, managedRoles []*roletypes.Role) error {
err := granter.store.RunInTx(ctx, func(ctx context.Context) error {
for _, role := range managedRoles {
err := granter.store.Create(ctx, roletypes.NewStorableRoleFromRole(role))
if err != nil {
return err
}
}
return nil
})
if err != nil {
return err
}
return nil
}

View File

@@ -1,53 +0,0 @@
package implrole
import (
"context"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/roletypes"
"github.com/SigNoz/signoz/pkg/valuer"
)
type setter struct {
store roletypes.Store
authz authz.AuthZ
}
func NewSetter(store roletypes.Store, authz authz.AuthZ) role.Setter {
return &setter{store: store, authz: authz}
}
func (setter *setter) Create(_ context.Context, _ valuer.UUID, _ *roletypes.Role) error {
return errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (setter *setter) GetOrCreate(_ context.Context, _ valuer.UUID, _ *roletypes.Role) (*roletypes.Role, error) {
return nil, errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (setter *setter) GetResources(_ context.Context) []*authtypes.Resource {
return nil
}
func (setter *setter) GetObjects(ctx context.Context, orgID valuer.UUID, id valuer.UUID, relation authtypes.Relation) ([]*authtypes.Object, error) {
return nil, errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (setter *setter) Patch(_ context.Context, _ valuer.UUID, _ *roletypes.Role) error {
return errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (setter *setter) PatchObjects(_ context.Context, _ valuer.UUID, _ string, _ authtypes.Relation, _, _ []*authtypes.Object) error {
return errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (setter *setter) Delete(_ context.Context, _ valuer.UUID, _ valuer.UUID) error {
return errors.Newf(errors.TypeUnsupported, roletypes.ErrCodeRoleUnsupported, "not implemented")
}
func (setter *setter) MustGetTypeables() []authtypes.Typeable {
return nil
}

View File

@@ -1,85 +0,0 @@
package role
import (
"context"
"net/http"
"github.com/SigNoz/signoz/pkg/types/authtypes"
"github.com/SigNoz/signoz/pkg/types/roletypes"
"github.com/SigNoz/signoz/pkg/valuer"
)
type Setter interface {
// Creates the role.
Create(context.Context, valuer.UUID, *roletypes.Role) error
// Gets the role if it exists or creates one.
GetOrCreate(context.Context, valuer.UUID, *roletypes.Role) (*roletypes.Role, error)
// Gets the objects associated with the given role and relation.
GetObjects(context.Context, valuer.UUID, valuer.UUID, authtypes.Relation) ([]*authtypes.Object, error)
// Gets all the typeable resources registered from role registry.
GetResources(context.Context) []*authtypes.Resource
// Patches the role.
Patch(context.Context, valuer.UUID, *roletypes.Role) error
// Patches the objects in authorization server associated with the given role and relation
PatchObjects(context.Context, valuer.UUID, string, authtypes.Relation, []*authtypes.Object, []*authtypes.Object) error
// Deletes the role and tuples in authorization server.
Delete(context.Context, valuer.UUID, valuer.UUID) error
RegisterTypeable
}
type Getter interface {
// Gets the role
Get(context.Context, valuer.UUID, valuer.UUID) (*roletypes.Role, error)
// Gets the role by org_id and name
GetByOrgIDAndName(context.Context, valuer.UUID, string) (*roletypes.Role, error)
// Lists all the roles for the organization.
List(context.Context, valuer.UUID) ([]*roletypes.Role, error)
// Lists all the roles for the organization filtered by name
ListByOrgIDAndNames(context.Context, valuer.UUID, []string) ([]*roletypes.Role, error)
}
type Granter interface {
// Grants a role to the subject based on role name.
Grant(context.Context, valuer.UUID, string, string) error
// Revokes a granted role from the subject based on role name.
Revoke(context.Context, valuer.UUID, string, string) error
// Changes the granted role for the subject based on role name.
ModifyGrant(context.Context, valuer.UUID, string, string, string) error
// Bootstrap the managed roles.
CreateManagedRoles(context.Context, valuer.UUID, []*roletypes.Role) error
}
type RegisterTypeable interface {
MustGetTypeables() []authtypes.Typeable
}
type Handler interface {
Create(http.ResponseWriter, *http.Request)
Get(http.ResponseWriter, *http.Request)
GetObjects(http.ResponseWriter, *http.Request)
GetResources(http.ResponseWriter, *http.Request)
List(http.ResponseWriter, *http.Request)
Patch(http.ResponseWriter, *http.Request)
PatchObjects(http.ResponseWriter, *http.Request)
Delete(http.ResponseWriter, *http.Request)
}

View File

@@ -8,11 +8,11 @@ import (
"time"
"github.com/SigNoz/signoz/pkg/analytics"
"github.com/SigNoz/signoz/pkg/authz"
"github.com/SigNoz/signoz/pkg/emailing"
"github.com/SigNoz/signoz/pkg/errors"
"github.com/SigNoz/signoz/pkg/factory"
"github.com/SigNoz/signoz/pkg/modules/organization"
"github.com/SigNoz/signoz/pkg/modules/role"
"github.com/SigNoz/signoz/pkg/modules/user"
root "github.com/SigNoz/signoz/pkg/modules/user"
"github.com/SigNoz/signoz/pkg/tokenizer"
@@ -32,13 +32,13 @@ type Module struct {
emailing emailing.Emailing
settings factory.ScopedProviderSettings
orgSetter organization.Setter
granter role.Granter
authz authz.AuthZ
analytics analytics.Analytics
config user.Config
}
// This module is a WIP, don't take inspiration from this.
func NewModule(store types.UserStore, tokenizer tokenizer.Tokenizer, emailing emailing.Emailing, providerSettings factory.ProviderSettings, orgSetter organization.Setter, granter role.Granter, analytics analytics.Analytics, config user.Config) root.Module {
func NewModule(store types.UserStore, tokenizer tokenizer.Tokenizer, emailing emailing.Emailing, providerSettings factory.ProviderSettings, orgSetter organization.Setter, authz authz.AuthZ, analytics analytics.Analytics, config user.Config) root.Module {
settings := factory.NewScopedProviderSettings(providerSettings, "github.com/SigNoz/signoz/pkg/modules/user/impluser")
return &Module{
store: store,
@@ -47,7 +47,7 @@ func NewModule(store types.UserStore, tokenizer tokenizer.Tokenizer, emailing em
settings: settings,
orgSetter: orgSetter,
analytics: analytics,
granter: granter,
authz: authz,
config: config,
}
}
@@ -172,7 +172,7 @@ func (module *Module) CreateUser(ctx context.Context, input *types.User, opts ..
createUserOpts := root.NewCreateUserOptions(opts...)
// since assign is idempotent, multiple calls to assign won't cause issues in case of retries.
err := module.granter.Grant(ctx, input.OrgID, roletypes.MustGetSigNozManagedRoleFromExistingRole(input.Role), authtypes.MustNewSubject(authtypes.TypeableUser, input.ID.StringValue(), input.OrgID, nil))
err := module.authz.Grant(ctx, input.OrgID, roletypes.MustGetSigNozManagedRoleFromExistingRole(input.Role), authtypes.MustNewSubject(authtypes.TypeableUser, input.ID.StringValue(), input.OrgID, nil))
if err != nil {
return err
}
@@ -238,7 +238,7 @@ func (m *Module) UpdateUser(ctx context.Context, orgID valuer.UUID, id string, u
}
if user.Role != existingUser.Role {
err = m.granter.ModifyGrant(ctx,
err = m.authz.ModifyGrant(ctx,
orgID,
roletypes.MustGetSigNozManagedRoleFromExistingRole(existingUser.Role),
roletypes.MustGetSigNozManagedRoleFromExistingRole(user.Role),
@@ -301,7 +301,7 @@ func (module *Module) DeleteUser(ctx context.Context, orgID valuer.UUID, id stri
}
// since revoke is idempotent, multiple calls to revoke won't cause issues in case of retries
err = module.granter.Revoke(ctx, orgID, roletypes.MustGetSigNozManagedRoleFromExistingRole(user.Role), authtypes.MustNewSubject(authtypes.TypeableUser, id, orgID, nil))
err = module.authz.Revoke(ctx, orgID, roletypes.MustGetSigNozManagedRoleFromExistingRole(user.Role), authtypes.MustNewSubject(authtypes.TypeableUser, id, orgID, nil))
if err != nil {
return err
}
@@ -504,14 +504,14 @@ func (module *Module) CreateFirstUser(ctx context.Context, organization *types.O
}
managedRoles := roletypes.NewManagedRoles(organization.ID)
err = module.granter.Grant(ctx, organization.ID, roletypes.SigNozAdminRoleName, authtypes.MustNewSubject(authtypes.TypeableUser, user.ID.StringValue(), user.OrgID, nil))
err = module.authz.CreateManagedUserRoleTransactions(ctx, organization.ID, user.ID)
if err != nil {
return nil, err
}
if err = module.store.RunInTx(ctx, func(ctx context.Context) error {
err = module.orgSetter.Create(ctx, organization, func(ctx context.Context, orgID valuer.UUID) error {
err = module.granter.CreateManagedRoles(ctx, orgID, managedRoles)
err = module.authz.CreateManagedRoles(ctx, orgID, managedRoles)
if err != nil {
return err
}

View File

@@ -208,7 +208,16 @@ func (q *querier) QueryRange(ctx context.Context, orgID valuer.UUID, req *qbtype
event.GroupByApplied = len(spec.GroupBy) > 0
if spec.Source == telemetrytypes.SourceMeter {
spec.StepInterval = qbtypes.Step{Duration: time.Second * time.Duration(querybuilder.RecommendedStepIntervalForMeter(req.Start, req.End))}
if spec.StepInterval.Seconds() == 0 {
spec.StepInterval = qbtypes.Step{Duration: time.Second * time.Duration(querybuilder.RecommendedStepIntervalForMeter(req.Start, req.End))}
}
if spec.StepInterval.Seconds() < float64(querybuilder.MinAllowedStepIntervalForMeter(req.Start, req.End)) {
newStep := qbtypes.Step{
Duration: time.Second * time.Duration(querybuilder.MinAllowedStepIntervalForMeter(req.Start, req.End)),
}
spec.StepInterval = newStep
}
} else {
if spec.StepInterval.Seconds() == 0 {
spec.StepInterval = qbtypes.Step{
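The hunk above makes the meter branch honor a user-supplied step interval: a zero step falls back to the recommended value, and anything below the minimum allowed step is raised to it. A small sketch of that clamp as a pure function; clampMeterStep and its integer-seconds parameters are illustrative stand-ins for the querybuilder helpers named in the hunk.

package example // illustration only

import "time"

// clampMeterStep applies the same policy as the hunk above: default a zero
// step to the recommended value, then enforce the minimum allowed step.
func clampMeterStep(step time.Duration, recommendedSeconds, minSeconds int64) time.Duration {
    if step == 0 {
        step = time.Duration(recommendedSeconds) * time.Second
    }
    if step < time.Duration(minSeconds)*time.Second {
        step = time.Duration(minSeconds) * time.Second
    }
    return step
}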

View File

@@ -830,7 +830,7 @@ func (r *ClickHouseReader) GetUsage(ctx context.Context, queryParams *model.GetU
func (r *ClickHouseReader) GetSpansForTrace(ctx context.Context, traceID string, traceDetailsQuery string) ([]model.SpanItemV2, *model.ApiError) {
var traceSummary model.TraceSummary
summaryQuery := fmt.Sprintf("SELECT * from %s.%s WHERE trace_id=$1", r.TraceDB, r.traceSummaryTable)
summaryQuery := fmt.Sprintf("SELECT trace_id, min(start) AS start, max(end) AS end, sum(num_spans) AS num_spans FROM %s.%s WHERE trace_id=$1 GROUP BY trace_id", r.TraceDB, r.traceSummaryTable)
err := r.db.QueryRow(ctx, summaryQuery, traceID).Scan(&traceSummary.TraceID, &traceSummary.Start, &traceSummary.End, &traceSummary.NumSpans)
if err != nil {
if err == sql.ErrNoRows {
@@ -6458,7 +6458,7 @@ func (r *ClickHouseReader) SearchTraces(ctx context.Context, params *model.Searc
}
var traceSummary model.TraceSummary
summaryQuery := fmt.Sprintf("SELECT * from %s.%s WHERE trace_id=$1", r.TraceDB, r.traceSummaryTable)
summaryQuery := fmt.Sprintf("SELECT trace_id, min(start) AS start, max(end) AS end, sum(num_spans) AS num_spans FROM %s.%s WHERE trace_id=$1 GROUP BY trace_id", r.TraceDB, r.traceSummaryTable)
err := r.db.QueryRow(ctx, summaryQuery, params.TraceID).Scan(&traceSummary.TraceID, &traceSummary.Start, &traceSummary.End, &traceSummary.NumSpans)
if err != nil {
if err == sql.ErrNoRows {
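Both hunks above replace a SELECT * lookup on the trace summary table with an explicit per-trace aggregation, presumably because the table can hold more than one row per trace that has to be merged at read time. A minimal sketch of the reworked lookup with the query laid out for readability; the local interfaces and field types are simplified stand-ins for the reader's ClickHouse connection and model.TraceSummary.

package example // illustration only

import (
    "context"
    "fmt"
)

// Minimal stand-ins for the ClickHouse connection used by the reader.
type row interface{ Scan(dest ...any) error }
type querier interface {
    QueryRow(ctx context.Context, query string, args ...any) row
}

// traceSummary mirrors the columns scanned above; the field types are assumed.
type traceSummary struct {
    TraceID  string
    Start    uint64
    End      uint64
    NumSpans uint64
}

// getTraceSummary merges the summary rows for one trace at read time instead
// of selecting them raw, matching the rewritten query in the hunks above.
func getTraceSummary(ctx context.Context, db querier, traceDB, table, traceID string) (traceSummary, error) {
    var s traceSummary
    q := fmt.Sprintf(
        "SELECT trace_id, min(start) AS start, max(end) AS end, sum(num_spans) AS num_spans "+
            "FROM %s.%s WHERE trace_id=$1 GROUP BY trace_id", traceDB, table)
    err := db.QueryRow(ctx, q, traceID).Scan(&s.TraceID, &s.Start, &s.End, &s.NumSpans)
    return s, err
}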

View File

@@ -68,7 +68,7 @@ import (
"github.com/SigNoz/signoz/pkg/types/opamptypes"
"github.com/SigNoz/signoz/pkg/types/pipelinetypes"
qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
ruletypes "github.com/SigNoz/signoz/pkg/types/ruletypes"
"github.com/SigNoz/signoz/pkg/types/ruletypes"
traceFunnels "github.com/SigNoz/signoz/pkg/types/tracefunneltypes"
"go.uber.org/zap"
@@ -103,10 +103,11 @@ type APIHandler struct {
querierV2 interfaces.Querier
queryBuilder *queryBuilder.QueryBuilder
// temporalityMap is a map of metric name to temporality
// to avoid fetching temporality for the same metric multiple times
// querying the v4 table on low cardinal temporality column
// should be fast but we can still avoid the query if we have the data in memory
// temporalityMap is a map of metric name to temporality to avoid fetching
// temporality for the same metric multiple times.
//
// Querying the v4 table on a low-cardinality temporality column should be
// fast, but we can still avoid the query if we have the data in memory.
temporalityMap map[string]map[v3.Temporality]bool
temporalityMux sync.Mutex
@@ -1010,7 +1011,7 @@ func (aH *APIHandler) getRuleStateHistory(w http.ResponseWriter, r *http.Request
// the query range is calculated based on the rule's evalWindow and evalDelay
// alerts have 2 minutes delay built in, so we need to subtract that from the start time
// to get the correct query range
start := end.Add(-time.Duration(rule.EvalWindow)).Add(-3 * time.Minute)
start := end.Add(-rule.EvalWindow.Duration() - 3*time.Minute)
if rule.AlertType == ruletypes.AlertTypeLogs {
if rule.Version != "v5" {
res.Items[idx].RelatedLogsLink = contextlinks.PrepareLinksToLogs(start, end, newFilters)
@@ -1217,12 +1218,12 @@ func (aH *APIHandler) Get(rw http.ResponseWriter, r *http.Request) {
dashboard := new(dashboardtypes.Dashboard)
if aH.CloudIntegrationsController.IsCloudIntegrationDashboardUuid(id) {
cloudintegrationDashboard, apiErr := aH.CloudIntegrationsController.GetDashboardById(ctx, orgID, id)
cloudIntegrationDashboard, apiErr := aH.CloudIntegrationsController.GetDashboardById(ctx, orgID, id)
if apiErr != nil {
render.Error(rw, errorsV2.Wrapf(apiErr, errorsV2.TypeInternal, errorsV2.CodeInternal, "failed to get dashboard"))
return
}
dashboard = cloudintegrationDashboard
dashboard = cloudIntegrationDashboard
} else if aH.IntegrationsController.IsInstalledIntegrationDashboardID(id) {
integrationDashboard, apiErr := aH.IntegrationsController.GetInstalledIntegrationDashboardById(ctx, orgID, id)
if apiErr != nil {
@@ -1551,13 +1552,13 @@ func (aH *APIHandler) queryMetrics(w http.ResponseWriter, r *http.Request) {
RespondError(w, &model.ApiError{Typ: model.ErrorExec, Err: res.Err}, nil)
}
response_data := &model.QueryData{
responseData := &model.QueryData{
ResultType: res.Value.Type(),
Result: res.Value,
Stats: qs,
}
aH.Respond(w, response_data)
aH.Respond(w, responseData)
}
@@ -2639,12 +2640,12 @@ func (aH *APIHandler) getProducerData(w http.ResponseWriter, r *http.Request) {
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
result, errQuriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
result = postprocess.TransformToTableForClickHouseQueries(result)
@@ -2692,12 +2693,12 @@ func (aH *APIHandler) getConsumerData(w http.ResponseWriter, r *http.Request) {
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
result, errQuriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
result = postprocess.TransformToTableForClickHouseQueries(result)
@@ -2746,12 +2747,12 @@ func (aH *APIHandler) getPartitionOverviewLatencyData(w http.ResponseWriter, r *
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
result, errQuriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
result = postprocess.TransformToTableForClickHouseQueries(result)
@@ -2800,12 +2801,12 @@ func (aH *APIHandler) getConsumerPartitionLatencyData(w http.ResponseWriter, r *
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
result, errQuriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
result = postprocess.TransformToTableForClickHouseQueries(result)
@@ -2857,12 +2858,12 @@ func (aH *APIHandler) getProducerThroughputOverview(w http.ResponseWriter, r *ht
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
result, errQuriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, producerQueryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, producerQueryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
@@ -2968,12 +2969,12 @@ func (aH *APIHandler) getProducerThroughputDetails(w http.ResponseWriter, r *htt
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
result, errQuriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
result = postprocess.TransformToTableForClickHouseQueries(result)
@@ -3022,12 +3023,12 @@ func (aH *APIHandler) getConsumerThroughputOverview(w http.ResponseWriter, r *ht
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
result, errQuriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
result = postprocess.TransformToTableForClickHouseQueries(result)
@@ -3076,12 +3077,12 @@ func (aH *APIHandler) getConsumerThroughputDetails(w http.ResponseWriter, r *htt
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
result, errQuriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
result = postprocess.TransformToTableForClickHouseQueries(result)
@@ -3136,12 +3137,12 @@ func (aH *APIHandler) getProducerConsumerEval(w http.ResponseWriter, r *http.Req
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
result, errQuriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(r.Context(), orgID, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
@@ -4125,11 +4126,11 @@ func (aH *APIHandler) ListLogsPipelinesHandler(w http.ResponseWriter, r *http.Re
aH.Respond(w, payload)
}
// listLogsPipelines lists logs piplines for latest version
// listLogsPipelines lists logs pipelines for latest version
func (aH *APIHandler) listLogsPipelines(ctx context.Context, orgID valuer.UUID) (
*logparsingpipeline.PipelinesResponse, error,
) {
// get lateset agent config
// get latest agent config
latestVersion := -1
lastestConfig, err := agentConf.GetLatestVersion(ctx, orgID, opamptypes.ElementTypeLogPipelines)
if err != nil && !errorsV2.Ast(err, errorsV2.TypeNotFound) {
@@ -4426,7 +4427,7 @@ func (aH *APIHandler) queryRangeV3(ctx context.Context, queryRangeParams *v3.Que
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
var spanKeys map[string]v3.AttributeKey
if queryRangeParams.CompositeQuery.QueryType == v3.QueryTypeBuilder {
hasLogsQuery := false
@@ -4443,7 +4444,7 @@ func (aH *APIHandler) queryRangeV3(ctx context.Context, queryRangeParams *v3.Que
if logsv3.EnrichmentRequired(queryRangeParams) && hasLogsQuery {
logsFields, apiErr := aH.reader.GetLogFieldsFromNames(ctx, logsv3.GetFieldNames(queryRangeParams.CompositeQuery))
if apiErr != nil {
RespondError(w, apiErr, errQuriesByName)
RespondError(w, apiErr, errQueriesByName)
return
}
// get the fields if any logs query is present
@@ -4454,7 +4455,7 @@ func (aH *APIHandler) queryRangeV3(ctx context.Context, queryRangeParams *v3.Que
spanKeys, err = aH.getSpanKeysV3(ctx, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorInternal, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
tracesV4.Enrich(queryRangeParams, spanKeys)
@@ -4499,11 +4500,11 @@ func (aH *APIHandler) queryRangeV3(ctx context.Context, queryRangeParams *v3.Que
}
}
result, errQuriesByName, err = aH.querier.QueryRange(ctx, orgID, queryRangeParams)
result, errQueriesByName, err = aH.querier.QueryRange(ctx, orgID, queryRangeParams)
if err != nil {
queryErrors := map[string]string{}
for name, err := range errQuriesByName {
for name, err := range errQueriesByName {
queryErrors[fmt.Sprintf("Query-%s", name)] = err.Error()
}
apiErrObj := &model.ApiError{Typ: model.ErrorInternal, Err: err}
@@ -4779,7 +4780,7 @@ func (aH *APIHandler) queryRangeV4(ctx context.Context, queryRangeParams *v3.Que
}
var result []*v3.Result
var errQuriesByName map[string]error
var errQueriesByName map[string]error
var spanKeys map[string]v3.AttributeKey
if queryRangeParams.CompositeQuery.QueryType == v3.QueryTypeBuilder {
hasLogsQuery := false
@@ -4809,7 +4810,7 @@ func (aH *APIHandler) queryRangeV4(ctx context.Context, queryRangeParams *v3.Que
spanKeys, err = aH.getSpanKeysV3(ctx, queryRangeParams)
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorInternal, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
tracesV4.Enrich(queryRangeParams, spanKeys)
@@ -4832,11 +4833,11 @@ func (aH *APIHandler) queryRangeV4(ctx context.Context, queryRangeParams *v3.Que
}
}
result, errQuriesByName, err = aH.querierV2.QueryRange(ctx, orgID, queryRangeParams)
result, errQueriesByName, err = aH.querierV2.QueryRange(ctx, orgID, queryRangeParams)
if err != nil {
queryErrors := map[string]string{}
for name, err := range errQuriesByName {
for name, err := range errQueriesByName {
queryErrors[fmt.Sprintf("Query-%s", name)] = err.Error()
}
apiErrObj := &model.ApiError{Typ: model.ErrorInternal, Err: err}
@@ -4853,7 +4854,7 @@ func (aH *APIHandler) queryRangeV4(ctx context.Context, queryRangeParams *v3.Que
if err != nil {
apiErrObj := &model.ApiError{Typ: model.ErrorBadData, Err: err}
RespondError(w, apiErrObj, errQuriesByName)
RespondError(w, apiErrObj, errQueriesByName)
return
}
aH.sendQueryResultEvents(r, result, queryRangeParams, "v4")

View File

@@ -207,7 +207,7 @@ func (s *Server) createPublicServer(api *APIHandler, web web.Web) (*http.Server,
r.Use(middleware.NewLogging(s.signoz.Instrumentation.Logger(), s.config.APIServer.Logging.ExcludedRoutes).Wrap)
r.Use(middleware.NewComment().Wrap)
am := middleware.NewAuthZ(s.signoz.Instrumentation.Logger(), s.signoz.Modules.OrgGetter, s.signoz.Authz, s.signoz.Modules.RoleGetter)
am := middleware.NewAuthZ(s.signoz.Instrumentation.Logger(), s.signoz.Modules.OrgGetter, s.signoz.Authz)
api.RegisterRoutes(r, am)
api.RegisterLogsRoutes(r, am)

View File

@@ -5,10 +5,10 @@ import (
"os"
"regexp"
"strconv"
"time"
"github.com/SigNoz/signoz/pkg/query-service/model"
v3 "github.com/SigNoz/signoz/pkg/query-service/model/v3"
"github.com/SigNoz/signoz/pkg/valuer"
)
const (
@@ -40,11 +40,11 @@ const NormalizedMetricsMapQueryThreads = 10
var NormalizedMetricsMapRegex = regexp.MustCompile(`[^a-zA-Z0-9]`)
var NormalizedMetricsMapQuantileRegex = regexp.MustCompile(`(?i)([._-]?quantile.*)$`)
func GetEvalDelay() time.Duration {
func GetEvalDelay() valuer.TextDuration {
evalDelayStr := GetOrDefaultEnv("RULES_EVAL_DELAY", "2m")
evalDelayDuration, err := time.ParseDuration(evalDelayStr)
evalDelayDuration, err := valuer.ParseTextDuration(evalDelayStr)
if err != nil {
return 0
return valuer.TextDuration{}
}
return evalDelayDuration
}

View File

@@ -40,13 +40,13 @@ type BaseRule struct {
// evalWindow is the time window used for evaluating the rule
// i.e. each time we lookback from the current time, we look at data for the last
// evalWindow duration
evalWindow time.Duration
evalWindow valuer.TextDuration
// holdDuration is the duration for which the alert waits before firing
holdDuration time.Duration
holdDuration valuer.TextDuration
// evalDelay is the delay in evaluation of the rule
// this is useful in cases where the data is not available immediately
evalDelay time.Duration
evalDelay valuer.TextDuration
// holds the static set of labels and annotations for the rule
// these are the same for all alerts created for this rule
@@ -94,7 +94,7 @@ type BaseRule struct {
evaluation ruletypes.Evaluation
// newGroupEvalDelay is the grace period for new alert groups
newGroupEvalDelay *time.Duration
newGroupEvalDelay valuer.TextDuration
queryParser queryparser.QueryParser
}
@@ -113,7 +113,7 @@ func WithSendUnmatched() RuleOption {
}
}
func WithEvalDelay(dur time.Duration) RuleOption {
func WithEvalDelay(dur valuer.TextDuration) RuleOption {
return func(r *BaseRule) {
r.evalDelay = dur
}
@@ -163,7 +163,7 @@ func NewBaseRule(id string, orgID valuer.UUID, p *ruletypes.PostableRule, reader
source: p.Source,
typ: p.AlertType,
ruleCondition: p.RuleCondition,
evalWindow: time.Duration(p.EvalWindow),
evalWindow: p.EvalWindow,
labels: qslabels.FromMap(p.Labels),
annotations: qslabels.FromMap(p.Annotations),
preferredChannels: p.PreferredChannels,
@@ -176,13 +176,12 @@ func NewBaseRule(id string, orgID valuer.UUID, p *ruletypes.PostableRule, reader
}
// Store newGroupEvalDelay and groupBy keys from NotificationSettings
if p.NotificationSettings != nil && p.NotificationSettings.NewGroupEvalDelay != nil {
newGroupEvalDelay := time.Duration(*p.NotificationSettings.NewGroupEvalDelay)
baseRule.newGroupEvalDelay = &newGroupEvalDelay
if p.NotificationSettings != nil {
baseRule.newGroupEvalDelay = p.NotificationSettings.NewGroupEvalDelay
}
if baseRule.evalWindow == 0 {
baseRule.evalWindow = 5 * time.Minute
if baseRule.evalWindow.IsZero() {
baseRule.evalWindow = valuer.MustParseTextDuration("5m")
}
for _, opt := range opts {
@@ -245,15 +244,15 @@ func (r *BaseRule) ActiveAlertsLabelFP() map[uint64]struct{} {
return activeAlerts
}
func (r *BaseRule) EvalDelay() time.Duration {
func (r *BaseRule) EvalDelay() valuer.TextDuration {
return r.evalDelay
}
func (r *BaseRule) EvalWindow() time.Duration {
func (r *BaseRule) EvalWindow() valuer.TextDuration {
return r.evalWindow
}
func (r *BaseRule) HoldDuration() time.Duration {
func (r *BaseRule) HoldDuration() valuer.TextDuration {
return r.holdDuration
}
@@ -281,7 +280,7 @@ func (r *BaseRule) Timestamps(ts time.Time) (time.Time, time.Time) {
start := st.UnixMilli()
end := en.UnixMilli()
if r.evalDelay > 0 {
if r.evalDelay.IsPositive() {
start = start - r.evalDelay.Milliseconds()
end = end - r.evalDelay.Milliseconds()
}
@@ -552,7 +551,7 @@ func (r *BaseRule) PopulateTemporality(ctx context.Context, orgID valuer.UUID, q
// ShouldSkipNewGroups returns true if new group filtering should be applied
func (r *BaseRule) ShouldSkipNewGroups() bool {
return r.newGroupEvalDelay != nil && *r.newGroupEvalDelay > 0
return r.newGroupEvalDelay.IsPositive()
}
// isFilterNewSeriesSupported checks if the query is supported for new series filtering
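The hunks above migrate the rule timing fields from time.Duration (and *time.Duration) to valuer.TextDuration. A short sketch of the calls the new type has to support, using only the methods that appear in this diff (IsZero, IsPositive, Duration, Milliseconds, MustParseTextDuration); the helper itself is illustrative, while the 5m default is the one applied in NewBaseRule above.

package example // illustration only

import (
    "time"

    "github.com/SigNoz/signoz/pkg/valuer"
)

// evalRange shows how rule timestamps are derived with the new type: default
// the window when it is zero, then shift the whole range back by the eval delay.
func evalRange(evalWindow, evalDelay valuer.TextDuration, now time.Time) (startMs, endMs int64) {
    if evalWindow.IsZero() {
        evalWindow = valuer.MustParseTextDuration("5m") // default applied in NewBaseRule
    }
    startMs = now.Add(-evalWindow.Duration()).UnixMilli()
    endMs = now.UnixMilli()
    if evalDelay.IsPositive() {
        startMs -= evalDelay.Milliseconds()
        endMs -= evalDelay.Milliseconds()
    }
    return startMs, endMs
}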

View File

@@ -20,7 +20,7 @@ import (
"github.com/SigNoz/signoz/pkg/telemetrystore/telemetrystoretest"
"github.com/SigNoz/signoz/pkg/types/metrictypes"
qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
ruletypes "github.com/SigNoz/signoz/pkg/types/ruletypes"
"github.com/SigNoz/signoz/pkg/types/ruletypes"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes/telemetrytypestest"
"github.com/SigNoz/signoz/pkg/valuer"
@@ -124,8 +124,8 @@ func createPostableRule(compositeQuery *v3.CompositeQuery) ruletypes.PostableRul
Evaluation: &ruletypes.EvaluationEnvelope{
Kind: ruletypes.RollingEvaluation,
Spec: ruletypes.RollingWindow{
EvalWindow: ruletypes.Duration(5 * time.Minute),
Frequency: ruletypes.Duration(1 * time.Minute),
EvalWindow: valuer.MustParseTextDuration("5m"),
Frequency: valuer.MustParseTextDuration("1m"),
},
},
RuleCondition: &ruletypes.RuleCondition{
@@ -151,7 +151,7 @@ type filterNewSeriesTestCase struct {
compositeQuery *v3.CompositeQuery
series []*v3.Series
firstSeenMap map[telemetrytypes.MetricMetadataLookupKey]int64
newGroupEvalDelay *time.Duration
newGroupEvalDelay valuer.TextDuration
evalTime time.Time
expectedFiltered []*v3.Series // series that should be in the final filtered result (old enough)
expectError bool
@@ -159,7 +159,8 @@ type filterNewSeriesTestCase struct {
func TestBaseRule_FilterNewSeries(t *testing.T) {
defaultEvalTime := time.Unix(1700000000, 0)
defaultDelay := 2 * time.Minute
defaultNewGroupEvalDelay := valuer.MustParseTextDuration("2m")
defaultDelay := defaultNewGroupEvalDelay.Duration()
defaultGroupByFields := []string{"service_name", "env"}
logger := instrumentationtest.New().Logger()
@@ -202,7 +203,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createFirstSeenMap("request_total", defaultGroupByFields, defaultEvalTime, defaultDelay, false, "svc-new", "prod"),
// svc-missing has no metadata, so it will be included
),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc-old", "env": "prod"}, nil),
@@ -234,7 +235,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createFirstSeenMap("request_total", defaultGroupByFields, defaultEvalTime, defaultDelay, false, "svc-new1", "prod"),
createFirstSeenMap("request_total", defaultGroupByFields, defaultEvalTime, defaultDelay, false, "svc-new2", "stage"),
),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{}, // all should be filtered out (new series)
},
@@ -261,7 +262,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createFirstSeenMap("request_total", defaultGroupByFields, defaultEvalTime, defaultDelay, true, "svc-old1", "prod"),
createFirstSeenMap("request_total", defaultGroupByFields, defaultEvalTime, defaultDelay, true, "svc-old2", "stage"),
),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc-old1", "env": "prod"}, nil),
@@ -295,7 +296,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createTestSeries(map[string]string{"service_name": "svc1", "env": "prod"}, nil),
},
firstSeenMap: make(map[telemetrytypes.MetricMetadataLookupKey]int64),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc1", "env": "prod"}, nil),
@@ -325,7 +326,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createTestSeries(map[string]string{"service_name": "svc1", "env": "prod"}, nil),
},
firstSeenMap: make(map[telemetrytypes.MetricMetadataLookupKey]int64),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc1", "env": "prod"}, nil),
@@ -361,7 +362,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createTestSeries(map[string]string{"status": "200"}, nil), // no service_name or env
},
firstSeenMap: make(map[telemetrytypes.MetricMetadataLookupKey]int64),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"status": "200"}, nil),
@@ -390,7 +391,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
},
firstSeenMap: createFirstSeenMap("request_total", defaultGroupByFields, defaultEvalTime, defaultDelay, true, "svc-old", "prod"),
// svc-no-metadata has no entry in firstSeenMap
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc-old", "env": "prod"}, nil),
@@ -420,7 +421,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
{MetricName: "request_total", AttributeName: "service_name", AttributeValue: "svc-partial"}: calculateFirstSeen(defaultEvalTime, defaultDelay, true),
// env metadata is missing
},
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc-partial", "env": "prod"}, nil),
@@ -454,7 +455,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
},
series: []*v3.Series{},
firstSeenMap: make(map[telemetrytypes.MetricMetadataLookupKey]int64),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{},
},
@@ -488,7 +489,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createTestSeries(map[string]string{"service_name": "svc1", "env": "prod"}, nil),
},
firstSeenMap: createFirstSeenMap("request_total", defaultGroupByFields, defaultEvalTime, defaultDelay, true, "svc1", "prod"),
newGroupEvalDelay: func() *time.Duration { d := time.Duration(0); return &d }(), // zero delay
newGroupEvalDelay: valuer.TextDuration{}, // zero delay
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc1", "env": "prod"}, nil),
@@ -532,7 +533,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createFirstSeenMap("request_total", defaultGroupByFields, defaultEvalTime, defaultDelay, true, "svc1", "prod"),
createFirstSeenMap("error_total", defaultGroupByFields, defaultEvalTime, defaultDelay, true, "svc1", "prod"),
),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc1", "env": "prod"}, nil),
@@ -572,7 +573,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createFirstSeenMap("request_total", []string{"service_name"}, defaultEvalTime, defaultDelay, true, "svc1"),
createFirstSeenMap("request_total", []string{"env"}, defaultEvalTime, defaultDelay, false, "prod"),
),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{}, // max first_seen is new, so should be filtered out
},
@@ -604,7 +605,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createTestSeries(map[string]string{"service_name": "svc2"}, nil),
},
firstSeenMap: make(map[telemetrytypes.MetricMetadataLookupKey]int64),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc1"}, nil),
@@ -639,7 +640,7 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
createTestSeries(map[string]string{"service_name": "svc2"}, nil),
},
firstSeenMap: make(map[telemetrytypes.MetricMetadataLookupKey]int64),
newGroupEvalDelay: &defaultDelay,
newGroupEvalDelay: defaultNewGroupEvalDelay,
evalTime: defaultEvalTime,
expectedFiltered: []*v3.Series{
createTestSeries(map[string]string{"service_name": "svc1"}, nil),
@@ -697,20 +698,14 @@ func TestBaseRule_FilterNewSeries(t *testing.T) {
telemetryStore,
prometheustest.New(context.Background(), settings, prometheus.Config{}, telemetryStore),
"",
time.Duration(time.Second),
time.Second,
nil,
readerCache,
options,
)
// Set newGroupEvalDelay in NotificationSettings if provided
if tt.newGroupEvalDelay != nil {
postableRule.NotificationSettings = &ruletypes.NotificationSettings{
NewGroupEvalDelay: func() *ruletypes.Duration {
d := ruletypes.Duration(*tt.newGroupEvalDelay)
return &d
}(),
}
postableRule.NotificationSettings = &ruletypes.NotificationSettings{
NewGroupEvalDelay: tt.newGroupEvalDelay,
}
// Create BaseRule using NewBaseRule

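The test changes above are mostly mechanical: newGroupEvalDelay moves from *time.Duration (nil meaning "not set") to a plain valuer.TextDuration whose zero value means the same thing, so the pointer-wrapping closure in NotificationSettings disappears. A compressed before/after sketch of that shape (the expressions come from the diff; the struct and function names here are invented for illustration):

package optionalsketch

import (
    "time"

    "github.com/SigNoz/signoz/pkg/valuer"
)

// Before: an optional delay modelled as a pointer forces a nil check
// and a dereference at every call site.
type settingsWithPointer struct {
    NewGroupEvalDelay *time.Duration
}

func shouldSkipNewGroups(s settingsWithPointer) bool {
    return s.NewGroupEvalDelay != nil && *s.NewGroupEvalDelay > 0
}

// After: an optional delay modelled as a value; the zero value stands
// for "not set" and IsPositive() carries the whole check.
type settingsWithValue struct {
    NewGroupEvalDelay valuer.TextDuration
}

func shouldSkipNewGroupsValue(s settingsWithValue) bool {
    return s.NewGroupEvalDelay.IsPositive()
}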

@@ -30,7 +30,7 @@ import (
"github.com/SigNoz/signoz/pkg/types"
"github.com/SigNoz/signoz/pkg/types/alertmanagertypes"
"github.com/SigNoz/signoz/pkg/types/authtypes"
ruletypes "github.com/SigNoz/signoz/pkg/types/ruletypes"
"github.com/SigNoz/signoz/pkg/types/ruletypes"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
"github.com/SigNoz/signoz/pkg/valuer"
)
@@ -66,7 +66,7 @@ type PrepareTestRuleOptions struct {
OrgID valuer.UUID
}
const taskNamesuffix = "webAppEditor"
const taskNameSuffix = "webAppEditor"
func RuleIdFromTaskName(n string) string {
return strings.Split(n, "-groupname")[0]
@@ -97,7 +97,7 @@ type ManagerOptions struct {
SLogger *slog.Logger
Cache cache.Cache
EvalDelay time.Duration
EvalDelay valuer.TextDuration
PrepareTaskFunc func(opts PrepareTaskOptions) (Task, error)
PrepareTestRuleFunc func(opts PrepareTestRuleOptions) (int, *model.ApiError)
@@ -182,8 +182,8 @@ func defaultPrepareTaskFunc(opts PrepareTaskOptions) (Task, error) {
rules = append(rules, tr)
// create ch rule task for evalution
task = newTask(TaskTypeCh, opts.TaskName, taskNamesuffix, time.Duration(evaluation.GetFrequency()), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
// create ch rule task for evaluation
task = newTask(TaskTypeCh, opts.TaskName, taskNameSuffix, evaluation.GetFrequency().Duration(), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
} else if opts.Rule.RuleType == ruletypes.RuleTypeProm {
@@ -206,8 +206,8 @@ func defaultPrepareTaskFunc(opts PrepareTaskOptions) (Task, error) {
rules = append(rules, pr)
// create promql rule task for evalution
task = newTask(TaskTypeProm, opts.TaskName, taskNamesuffix, time.Duration(evaluation.GetFrequency()), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
// create promql rule task for evaluation
task = newTask(TaskTypeProm, opts.TaskName, taskNameSuffix, evaluation.GetFrequency().Duration(), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
} else {
return nil, fmt.Errorf("unsupported rule type %s. Supported types: %s, %s", opts.Rule.RuleType, ruletypes.RuleTypeProm, ruletypes.RuleTypeThreshold)
@@ -323,7 +323,7 @@ func (m *Manager) run(_ context.Context) {
}
// Stop the rule manager's rule evaluation cycles.
func (m *Manager) Stop(ctx context.Context) {
func (m *Manager) Stop(_ context.Context) {
m.mtx.Lock()
defer m.mtx.Unlock()
@@ -336,7 +336,7 @@ func (m *Manager) Stop(ctx context.Context) {
zap.L().Info("Rule manager stopped")
}
// EditRuleDefinition writes the rule definition to the
// EditRule writes the rule definition to the
// datastore and also updates the rule executor
func (m *Manager) EditRule(ctx context.Context, ruleStr string, id valuer.UUID) error {
claims, err := authtypes.ClaimsFromContext(ctx)
@@ -643,7 +643,7 @@ func (m *Manager) addTask(_ context.Context, orgID valuer.UUID, rule *ruletypes.
m.rules[r.ID()] = r
}
// If there is an another task with the same identifier, raise an error
// If there is another task with the same identifier, raise an error
_, ok := m.tasks[taskName]
if ok {
return fmt.Errorf("a rule with the same name already exists")
@@ -678,7 +678,8 @@ func (m *Manager) RuleTasks() []Task {
return rgs
}
// RuleTasks returns the list of manager's rule tasks.
// RuleTasksWithoutLock returns the list of manager's rule tasks without
// acquiring a lock on the manager.
func (m *Manager) RuleTasksWithoutLock() []Task {
rgs := make([]Task, 0, len(m.tasks))
@@ -889,7 +890,7 @@ func (m *Manager) syncRuleStateWithTask(ctx context.Context, orgID valuer.UUID,
} else {
// check if rule has a task running
if _, ok := m.tasks[taskName]; !ok {
// rule has not task, start one
// rule has no task, start one
if err := m.addTask(ctx, orgID, rule, taskName); err != nil {
return err
}

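Across the manager changes the text-backed duration is converted to a plain time.Duration exactly once, at the boundary where the standard library needs it (evaluation.GetFrequency().Duration() when creating a task). A small sketch of that pattern with a hypothetical startEvaluationLoop helper, not the real newTask:

package loopsketch

import (
    "time"

    "github.com/SigNoz/signoz/pkg/valuer"
)

// startEvaluationLoop converts the text-backed frequency to a
// time.Duration once, then drives a ticker with it. It returns a stop
// function. This is an illustration, not the real task scheduler.
func startEvaluationLoop(frequency valuer.TextDuration, evalFn func()) func() {
    d := frequency.Duration()
    if d <= 0 {
        // Zero value means "not configured"; fall back to a default,
        // mirroring the frequency == 0 guard at the end of this diff.
        d = time.Minute
    }

    ticker := time.NewTicker(d)
    done := make(chan struct{})
    go func() {
        defer ticker.Stop()
        for {
            select {
            case <-ticker.C:
                evalFn()
            case <-done:
                return
            }
        }
    }()
    return func() { close(done) }
}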

@@ -9,6 +9,7 @@ import (
qbtypes "github.com/SigNoz/signoz/pkg/types/querybuildertypes/querybuildertypesv5"
ruletypes "github.com/SigNoz/signoz/pkg/types/ruletypes"
"github.com/SigNoz/signoz/pkg/types/telemetrytypes"
"github.com/SigNoz/signoz/pkg/valuer"
)
// ThresholdRuleTestCase defines test case structure for threshold rule test notifications
@@ -40,8 +41,8 @@ func ThresholdRuleAtLeastOnceValueAbove(target float64, recovery *float64) rulet
AlertType: ruletypes.AlertTypeMetric,
RuleType: ruletypes.RuleTypeThreshold,
Evaluation: &ruletypes.EvaluationEnvelope{Kind: ruletypes.RollingEvaluation, Spec: ruletypes.RollingWindow{
EvalWindow: ruletypes.Duration(5 * time.Minute),
Frequency: ruletypes.Duration(1 * time.Minute),
EvalWindow: valuer.MustParseTextDuration("5m"),
Frequency: valuer.MustParseTextDuration("1m"),
}},
Labels: map[string]string{
"service.name": "frontend",
@@ -99,8 +100,8 @@ func BuildPromAtLeastOnceValueAbove(target float64, recovery *float64) ruletypes
AlertType: ruletypes.AlertTypeMetric,
RuleType: ruletypes.RuleTypeProm,
Evaluation: &ruletypes.EvaluationEnvelope{Kind: ruletypes.RollingEvaluation, Spec: ruletypes.RollingWindow{
EvalWindow: ruletypes.Duration(5 * time.Minute),
Frequency: ruletypes.Duration(1 * time.Minute),
EvalWindow: valuer.MustParseTextDuration("5m"),
Frequency: valuer.MustParseTextDuration("1m"),
}},
Labels: map[string]string{
"service.name": "frontend",


@@ -28,6 +28,8 @@ type PromRule struct {
prometheus prometheus.Prometheus
}
var _ Rule = (*PromRule)(nil)
func NewPromRule(
id string,
orgID valuer.UUID,
@@ -332,7 +334,7 @@ func (r *PromRule) Eval(ctx context.Context, ts time.Time) (int, error) {
continue
}
if a.State == model.StatePending && ts.Sub(a.ActiveAt) >= r.holdDuration {
if a.State == model.StatePending && ts.Sub(a.ActiveAt) >= r.holdDuration.Duration() {
a.State = model.StateFiring
a.FiredAt = ts
state := model.StateFiring
@@ -396,7 +398,7 @@ func (r *PromRule) String() string {
ar := ruletypes.PostableRule{
AlertName: r.name,
RuleCondition: r.ruleCondition,
EvalWindow: ruletypes.Duration(r.evalWindow),
EvalWindow: r.evalWindow,
Labels: r.labels.Map(),
Annotations: r.annotations.Map(),
PreferredChannels: r.preferredChannels,

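The PromRule hunk applies the same boundary conversion inside the pending-to-firing check: the configured hold duration only becomes a time.Duration at the comparison with ts.Sub(a.ActiveAt). A stripped-down sketch of that transition; the alert struct here is simplified and only the guard mirrors the diff:

package holdsketch

import (
    "time"

    "github.com/SigNoz/signoz/pkg/valuer"
)

// pendingAlert is a simplified stand-in for the real alert struct.
type pendingAlert struct {
    ActiveAt time.Time
    Firing   bool
}

// promoteIfHeld moves an alert from pending to firing once it has been
// active for at least the configured hold duration.
func promoteIfHeld(a *pendingAlert, ts time.Time, hold valuer.TextDuration) {
    if !a.Firing && ts.Sub(a.ActiveAt) >= hold.Duration() {
        a.Firing = true
    }
}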

@@ -41,12 +41,12 @@ type PromRuleTask struct {
orgID valuer.UUID
}
// newPromRuleTask holds rules that have promql condition
// and evalutes the rule at a given frequency
// NewPromRuleTask holds rules that have promql condition
// and evaluates the rule at a given frequency
func NewPromRuleTask(name, file string, frequency time.Duration, rules []Rule, opts *ManagerOptions, notify NotifyFunc, maintenanceStore ruletypes.MaintenanceStore, orgID valuer.UUID) *PromRuleTask {
zap.L().Info("Initiating a new rule group", zap.String("name", name), zap.Duration("frequency", frequency))
if time.Now() == time.Now().Add(frequency) {
if frequency == 0 {
frequency = DefaultFrequency
}
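The final hunk also tightens the zero-frequency guard in NewPromRuleTask: comparing two separate time.Now() calls is unreliable (time.Time equality also compares monotonic clock readings, so the old condition effectively never held), while frequency == 0 states the intent directly. A tiny sketch of the resulting default selection:

package freqsketch

import "time"

// effectiveFrequency returns the configured frequency, or a fallback
// when it is unset (zero), replacing the old pattern of comparing two
// separate time.Now() values.
func effectiveFrequency(frequency, fallback time.Duration) time.Duration {
    if frequency == 0 {
        return fallback
    }
    return frequency
}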

Some files were not shown because too many files have changed in this diff.