Mirror of https://github.com/SigNoz/signoz.git (synced 2026-02-07 18:32:12 +00:00)

Compare commits: integrate-...ns/claude-
33 Commits

| SHA1 |
|---|
| 08c53fe7e8 |
| c1fac00d2e |
| 15161c09e8 |
| ee5fbe41eb |
| f2f3a7b24a |
| dd0738ac70 |
| 1c4dfc931f |
| 605d6ba17d |
| f017b07525 |
| b5901ac174 |
| caacbc086c |
| 9d06ccab48 |
| de45292782 |
| 9f38305e5a |
| e1c8b68cd2 |
| 76ec089a43 |
| 4117c7442b |
| 4153515767 |
| 6a0ee32616 |
| 1f13b60703 |
| 0865d2edaf |
| 8629c959f0 |
| 10760e6e1b |
| 4f45645b32 |
| 1417e22ae4 |
| 3051d442c0 |
| ea15ce4e04 |
| 865a7a5a31 |
| de4ca50a40 |
| 8cabaafc58 |
| e9d66b8094 |
| 26d3d6b1e4 |
| 36d6debeab |
.claude/CLAUDE.md (new file, 136 lines)
@@ -0,0 +1,136 @@

# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

SigNoz is an open-source observability platform (APM, logs, metrics, traces) built on OpenTelemetry and ClickHouse. It provides a unified solution for monitoring applications with features including distributed tracing, log management, metrics dashboards, and alerting.

## Build and Development Commands

### Development Environment Setup

```bash
make devenv-up                     # Start ClickHouse and OTel Collector for local dev
make devenv-clickhouse             # Start only ClickHouse
make devenv-signoz-otel-collector  # Start only OTel Collector
make devenv-clickhouse-clean       # Clean ClickHouse data
```

### Backend (Go)

```bash
make go-run-community            # Run community backend server
make go-run-enterprise           # Run enterprise backend server
make go-test                     # Run all Go unit tests
go test -race ./pkg/...          # Run tests for specific package
go test -race ./pkg/querier/...  # Example: run querier tests
```

### Integration Tests (Python)

```bash
cd tests/integration
uv sync                # Install dependencies
make py-test-setup     # Start test environment (keep running with --reuse)
make py-test           # Run all integration tests
make py-test-teardown  # Stop test environment

# Run specific test
uv run pytest --basetemp=./tmp/ -vv --reuse src/<suite>/<file>.py::test_name
```

### Code Quality

```bash
# Go linting (golangci-lint)
golangci-lint run

# Python formatting/linting
make py-fmt   # Format with black
make py-lint  # Run isort, autoflake, pylint
```

### OpenAPI Generation

```bash
go run cmd/enterprise/*.go generate openapi
```

## Architecture Overview

### Backend Structure

The Go backend follows a **provider pattern** for dependency injection:

- **`pkg/signoz/`** - IoC container that wires all providers together
- **`pkg/modules/`** - Business logic modules (user, organization, dashboard, etc.)
- **`pkg/<provider>/`** - Provider implementations following a consistent structure (see the sketch below):
  - `<name>.go` - Interface definition
  - `config.go` - Configuration (implements `factory.Config`)
  - `<implname><name>/provider.go` - Implementation
  - `<name>test/` - Mock implementations for testing
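A compressed sketch of that layout for a hypothetical provider called `thing`; the interface, config field, and implementation names are illustrative, and only the file layout mirrors the convention above:

```go
// Hypothetical provider named "thing"; each comment marks the file the
// declaration would live in under the convention above.
package thing

import "context"

// pkg/thing/thing.go - interface definition
type Thing interface {
	Do(ctx context.Context) error
}

// pkg/thing/config.go - configuration (would implement factory.Config)
type Config struct {
	Enabled bool `mapstructure:"enabled"`
}

// pkg/thing/noopthing/provider.go - one implementation
type provider struct {
	config Config
}

func (provider) Do(ctx context.Context) error { return nil }

// pkg/thing/thingtest/ - mock implementations for testing would live here
```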
### Key Packages

- **`pkg/querier/`** - Query engine for telemetry data (logs, traces, metrics)
- **`pkg/telemetrystore/`** - ClickHouse telemetry storage interface
- **`pkg/sqlstore/`** - Relational database (SQLite/PostgreSQL) for metadata
- **`pkg/apiserver/`** - HTTP API server with OpenAPI integration
- **`pkg/alertmanager/`** - Alert management
- **`pkg/authn/`, `pkg/authz/`** - Authentication and authorization
- **`pkg/flagger/`** - Feature flags (OpenFeature-based)
- **`pkg/errors/`** - Structured error handling

### Enterprise vs Community

- **`cmd/community/`** - Community edition entry point
- **`cmd/enterprise/`** - Enterprise edition entry point
- **`ee/`** - Enterprise-only features

## Code Conventions

### Error Handling

Use the custom `pkg/errors` package instead of the standard library:

```go
errors.New(typ, code, message)           // Instead of errors.New()
errors.Newf(typ, code, message, args...) // Instead of fmt.Errorf()
errors.Wrapf(err, typ, code, msg)        // Wrap with context
```

Define domain-specific error codes:

```go
var CodeThingNotFound = errors.MustNewCode("thing_not_found")
```
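A hedged usage sketch combining these calls; the error type constant (`errors.TypeNotFound`), the `Dashboard` types, and the surrounding module method are assumptions for illustration, not code from the repository:

```go
// Hypothetical module method; errors.TypeNotFound and the Dashboard/store
// types are illustrative assumptions.
var CodeDashboardNotFound = errors.MustNewCode("dashboard_not_found")

func (m *module) Get(ctx context.Context, id string) (*Dashboard, error) {
	dashboard, err := m.store.Get(ctx, id)
	if err != nil {
		// Wrap the storage error with a domain type, code, and context.
		return nil, errors.Wrapf(err, errors.TypeNotFound, CodeDashboardNotFound, "dashboard %s does not exist", id)
	}
	return dashboard, nil
}
```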
### HTTP Handlers

Handlers are thin adapters in modules that:

1. Extract auth context from the request
2. Decode the request body using the `binding` package
3. Call module functions
4. Return responses using the `render` package

Register routes in `pkg/apiserver/signozapiserver/` with `handler.New()` and `OpenAPIDef`.
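A minimal handler sketch following the four steps above. The `authtypes.ClaimsFromContext`, `binding.JSON.BindBody`, and `render.*` call names shown here are assumptions used for illustration, not verified APIs from the repository:

```go
// Hypothetical handler sketch; the binding/render/authtypes function names
// and the PostableThing type are assumptions for illustration only.
func (h *handler) Create(rw http.ResponseWriter, req *http.Request) {
	ctx := req.Context()

	// 1. Extract auth context from the request.
	claims, err := authtypes.ClaimsFromContext(ctx) // assumed helper
	if err != nil {
		render.Error(rw, err) // assumed render API
		return
	}

	// 2. Decode the request body.
	var in PostableThing // hypothetical request type
	if err := binding.JSON.BindBody(req.Body, &in); err != nil { // assumed binding API
		render.Error(rw, err)
		return
	}

	// 3. Call the module function.
	out, err := h.module.Create(ctx, claims.OrgID, &in)
	if err != nil {
		render.Error(rw, err)
		return
	}

	// 4. Return the response.
	render.Success(rw, http.StatusCreated, out) // assumed render API
}
```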
### SQL/Database

- Use the Bun ORM via `sqlstore.BunDBCtx(ctx)` (see the sketch below)
- Star schema with `organizations` as the central entity
- All tables have `id`, `created_at`, `updated_at`, `org_id` columns
- Write idempotent migrations in `pkg/sqlmigration/`
- No `ON CASCADE` deletes - handle in application logic
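A minimal store-method sketch, assuming `BunDBCtx(ctx)` returns a Bun query builder as the convention above implies; the `Dashboard` model and its table are illustrative:

```go
// Hypothetical store method; the Dashboard model is illustrative and the
// org_id filter reflects the standard columns listed above.
func (s *store) ListByOrg(ctx context.Context, orgID string) ([]*Dashboard, error) {
	dashboards := make([]*Dashboard, 0)
	err := s.sqlstore.
		BunDBCtx(ctx).
		NewSelect().
		Model(&dashboards).
		Where("org_id = ?", orgID). // every table carries org_id
		Scan(ctx)
	if err != nil {
		return nil, err
	}
	return dashboards, nil
}
```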
### REST Endpoints

- Use plural resource names: `/v1/organizations`, `/v1/users`
- Use `me` for the current user/org: `/v1/organizations/me/users`
- Follow RESTful conventions for CRUD operations

### Linting Rules (from .golangci.yml)

- Don't use the standard `errors` package - use `pkg/errors`
- Don't use the `zap` logger - use `slog`
- Don't use `fmt.Errorf` or `fmt.Print*`

## Testing

### Unit Tests

- Run with the race detector: `go test -race ./...`
- Provider mocks are in `<provider>test/` packages

### Integration Tests

- Located in `tests/integration/`
- Use pytest with testcontainers
- Files are prefixed with numbers for execution order (e.g., `01_database.py`)
- Always use the `--reuse` flag during development
- Fixtures live in `tests/integration/fixtures/`
.claude/skills/commit/SKILL.md (new file, 36 lines)
@@ -0,0 +1,36 @@

---
name: commit
description: Create a conventional commit with staged changes
disable-model-invocation: true
allowed-tools: Bash(git:*)
---

# Create Conventional Commit

Commit staged changes using the conventional commit format: `type(scope): description`

## Types

- `feat:` - New feature
- `fix:` - Bug fix
- `chore:` - Maintenance/refactor/tooling
- `test:` - Tests only
- `docs:` - Documentation

## Process

1. Review staged changes: `git diff --cached`
2. Determine type, optional scope, and description (imperative, <70 chars)
3. Commit using a HEREDOC:
```bash
git commit -m "$(cat <<'EOF'
type(scope): description
EOF
)"
```
4. Verify: `git log -1`

## Notes

- Description: imperative mood, lowercase, no period
- Body: explain WHY, not WHAT (the code shows what)
.claude/skills/raise-pr/SKILL.md (new file, 55 lines)
@@ -0,0 +1,55 @@

---
name: raise-pr
description: Create a pull request with auto-filled template. Pass 'commit' to commit staged changes first.
disable-model-invocation: true
allowed-tools: Bash(gh:*, git:*), Read
argument-hint: [commit?]
---

# Raise Pull Request

Create a PR with an auto-filled template from the commits after origin/main.

## Arguments

- No argument: Create a PR with existing commits
- `commit`: Commit staged changes first, then create the PR

## Process

1. **If `$ARGUMENTS` is "commit"**: Review staged changes and commit with a descriptive message
   - Check for staged changes: `git diff --cached --stat`
   - If changes exist:
     - Review the changes: `git diff --cached`
     - Create a short, clear commit message based on the changes
     - Commit command: `git commit -m "message"`

2. **Analyze commits since origin/main**:
   - `git log origin/main..HEAD --pretty=format:"%s%n%b"` - get commit messages
   - `git diff origin/main...HEAD --stat` - see changes

3. **Read the template**: `.github/pull_request_template.md`

4. **Generate the PR**:
   - **Title**: Short (<70 chars), from the commit messages or the main change
   - **Body**: Fill template sections based on the commits/changes:
     - Summary (why/what/approach) - end with "Closes #<issue_number>" if an issue number is available from the branch name (`git branch --show-current`)
     - Change Type checkboxes
     - Bug Context (if applicable)
     - Testing Strategy
     - Risk Assessment
     - Changelog (if user-facing)
     - Checklist

5. **Create the PR**:
```bash
git push -u origin $(git branch --show-current)
gh pr create --base main --title "..." --body "..."
gh pr view
```

## Notes

- Analyze ALL commit messages from origin/main to HEAD
- Fill template sections based on code analysis
- Leave PR template sections as they are if their content cannot be determined
.github/CODEOWNERS (vendored, 8 changed lines)
@@ -103,10 +103,18 @@

/tests/integration/ @vikrantgupta25

# OpenAPI types generator
/frontend/src/api @SigNoz/frontend-maintainers

# Dashboard Owners
/frontend/src/hooks/dashboard/ @SigNoz/pulse-frontend

## Dashboard Types
/frontend/src/api/types/dashboard/ @SigNoz/pulse-frontend

## Dashboard List
/frontend/src/pages/DashboardsListPage/ @SigNoz/pulse-frontend
.github/pull_request_template.md (vendored, 9 changed lines)
@@ -6,6 +6,15 @@

> Why does this change exist?
> What problem does it solve, and why is this the right approach?

#### Screenshots / Screen Recordings (if applicable)
> Include screenshots or screen recordings that clearly show the behavior before the change and the result after the change. This helps reviewers quickly understand the impact and verify the update.

#### Issues closed by this PR
> Reference issues using `Closes #issue-number` to enable automatic closure on merge.

---

### ✅ Change Type
.github/workflows/commitci.yaml (vendored, 7 changed lines)
@@ -25,3 +25,10 @@ jobs:

          else
            echo "No references to 'ee' packages found in 'pkg' directory"
          fi

          if grep -R --include="*.go" '.*/ee/.*' cmd/community/; then
            echo "Error: Found references to 'ee' packages in 'cmd/community' directory"
            exit 1
          else
            echo "No references to 'ee' packages found in 'cmd/community' directory"
          fi
.github/workflows/goci.yaml (vendored, 4 changed lines)
@@ -65,6 +65,10 @@ jobs:

          set -ex
          sudo apt-get update
          sudo apt-get install -y gcc-aarch64-linux-gnu musl-tools
      - name: node-install
        uses: actions/setup-node@v5
        with:
          node-version: "22"
      - name: docker-community
        shell: bash
        run: |
||||
5
.github/workflows/integrationci.yaml
vendored
5
.github/workflows/integrationci.yaml
vendored
@@ -84,8 +84,11 @@ jobs:
|
||||
sudo rm /etc/apt/sources.list.d/google-chrome.list
|
||||
export CHROMEDRIVER_VERSION=`curl -s https://googlechromelabs.github.io/chrome-for-testing/LATEST_RELEASE_${CHROME_MAJOR_VERSION%%.*}`
|
||||
curl -L -O "https://storage.googleapis.com/chrome-for-testing-public/${CHROMEDRIVER_VERSION}/linux64/chromedriver-linux64.zip"
|
||||
unzip chromedriver-linux64.zip && chmod +x chromedriver && sudo mv chromedriver /usr/local/bin
|
||||
unzip chromedriver-linux64.zip
|
||||
chmod +x chromedriver-linux64/chromedriver
|
||||
sudo mv chromedriver-linux64/chromedriver /usr/local/bin/chromedriver
|
||||
chromedriver -version
|
||||
google-chrome-stable --version
|
||||
- name: run
|
||||
run: |
|
||||
cd tests/integration && \
|
||||
|
||||
.github/workflows/jsci.yaml (vendored, 13 changed lines)
@@ -17,10 +17,23 @@ jobs:

    steps:
      - name: checkout
        uses: actions/checkout@v4
      - name: setup node
        uses: actions/setup-node@v5
        with:
          node-version: "22"
      - name: install
        run: cd frontend && yarn install
      - name: tsc
        run: cd frontend && yarn tsc
  tsc2:
    if: |
      (github.event_name == 'pull_request' && ! github.event.pull_request.head.repo.fork && github.event.pull_request.user.login != 'dependabot[bot]' && ! contains(github.event.pull_request.labels.*.name, 'safe-to-test')) ||
      (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'safe-to-test'))
    uses: signoz/primus.workflows/.github/workflows/js-tsc.yaml@main
    secrets: inherit
    with:
      PRIMUS_REF: main
      JS_SRC: frontend
  test:
    if: |
      (github.event_name == 'pull_request' && ! github.event.pull_request.head.repo.fork && github.event.pull_request.user.login != 'dependabot[bot]' && ! contains(github.event.pull_request.labels.*.name, 'safe-to-test')) ||
.gitignore (vendored, 4 changed lines)
@@ -1,6 +1,9 @@

node_modules

.vscode
!.vscode/settings.json

deploy/docker/environment_tiny/common_test
frontend/node_modules
frontend/.pnp

@@ -104,7 +107,6 @@ dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/

@@ -12,6 +12,7 @@ linters:
    - misspell
    - nilnil
    - sloglint
    - wastedassign
    - unparam
    - unused
  settings:
.mockery.yml (13 changed lines)
@@ -4,7 +4,14 @@ packages:

  github.com/SigNoz/signoz/pkg/alertmanager:
    config:
      all: true
      dir: '{{.InterfaceDir}}/mocks'
      filename: "mocks.go"
      dir: '{{.InterfaceDir}}/alertmanagertest'
      filename: "alertmanager.go"
      structname: 'Mock{{.InterfaceName}}'
      pkgname: '{{.SrcPackageName}}mock'
      pkgname: '{{.SrcPackageName}}test'
  github.com/SigNoz/signoz/pkg/tokenizer:
    config:
      all: true
      dir: '{{.InterfaceDir}}/tokenizertest'
      filename: "tokenizer.go"
      structname: 'Mock{{.InterfaceName}}'
      pkgname: '{{.SrcPackageName}}test'
.vscode/settings.json (vendored, 6 changed lines)
@@ -5,5 +5,9 @@

  "editor.codeActionsOnSave": {
    "source.fixAll.eslint": "explicit"
  },
  "prettier.requireConfig": true
  "prettier.requireConfig": true,
  "[go]": {
    "editor.formatOnSave": true,
    "editor.defaultFormatter": "golang.go"
  }
}
Makefile (9 changed lines)
@@ -230,3 +230,12 @@ py-clean: ## Clear all pycache and pytest cache from tests directory recursively

	@find tests -type f -name "*.pyc" -delete 2>/dev/null || true
	@find tests -type f -name "*.pyo" -delete 2>/dev/null || true
	@echo ">> python cache cleaned"

##############################################################
# generate commands
##############################################################
.PHONY: gen-mocks
gen-mocks:
	@echo ">> Generating mocks"
	@mockery --config .mockery.yml
@@ -5,13 +5,14 @@ import (
	"log/slog"

	"github.com/SigNoz/signoz/cmd"
	"github.com/SigNoz/signoz/ee/authz/openfgaauthz"
	"github.com/SigNoz/signoz/ee/authz/openfgaschema"
	"github.com/SigNoz/signoz/ee/sqlstore/postgressqlstore"
	"github.com/SigNoz/signoz/pkg/analytics"
	"github.com/SigNoz/signoz/pkg/authn"
	"github.com/SigNoz/signoz/pkg/authz"
	"github.com/SigNoz/signoz/pkg/authz/openfgaauthz"
	"github.com/SigNoz/signoz/pkg/authz/openfgaschema"
	"github.com/SigNoz/signoz/pkg/factory"
	"github.com/SigNoz/signoz/pkg/gateway"
	"github.com/SigNoz/signoz/pkg/gateway/noopgateway"
	"github.com/SigNoz/signoz/pkg/licensing"
	"github.com/SigNoz/signoz/pkg/licensing/nooplicensing"
	"github.com/SigNoz/signoz/pkg/modules/dashboard"
@@ -24,7 +25,6 @@ import (
	"github.com/SigNoz/signoz/pkg/signoz"
	"github.com/SigNoz/signoz/pkg/sqlschema"
	"github.com/SigNoz/signoz/pkg/sqlstore"
	"github.com/SigNoz/signoz/pkg/sqlstore/sqlstorehook"
	"github.com/SigNoz/signoz/pkg/types/authtypes"
	"github.com/SigNoz/signoz/pkg/version"
	"github.com/SigNoz/signoz/pkg/zeus"
@@ -57,13 +57,6 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) e
	// print the version
	version.Info.PrettyPrint(config.Version)

	// add enterprise sqlstore factories to the community sqlstore factories
	sqlstoreFactories := signoz.NewSQLStoreProviderFactories()
	if err := sqlstoreFactories.Add(postgressqlstore.NewFactory(sqlstorehook.NewLoggingFactory())); err != nil {
		logger.ErrorContext(ctx, "failed to add postgressqlstore factory", "error", err)
		return err
	}

	signoz, err := signoz.New(
		ctx,
		config,
@@ -90,6 +83,9 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) e
		func(store sqlstore.SQLStore, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, _ role.Module, queryParser queryparser.QueryParser, _ querier.Querier, _ licensing.Licensing) dashboard.Module {
			return impldashboard.NewModule(impldashboard.NewStore(store), settings, analytics, orgGetter, queryParser)
		},
		func(_ licensing.Licensing) factory.ProviderFactory[gateway.Gateway, gateway.Config] {
			return noopgateway.NewProviderFactory()
		},
	)
	if err != nil {
		logger.ErrorContext(ctx, "failed to create signoz", "error", err)

@@ -10,6 +10,7 @@ import (
	"github.com/SigNoz/signoz/ee/authn/callbackauthn/samlcallbackauthn"
	"github.com/SigNoz/signoz/ee/authz/openfgaauthz"
	"github.com/SigNoz/signoz/ee/authz/openfgaschema"
	"github.com/SigNoz/signoz/ee/gateway/httpgateway"
	enterpriselicensing "github.com/SigNoz/signoz/ee/licensing"
	"github.com/SigNoz/signoz/ee/licensing/httplicensing"
	"github.com/SigNoz/signoz/ee/modules/dashboard/impldashboard"
@@ -22,6 +23,7 @@ import (
	"github.com/SigNoz/signoz/pkg/authn"
	"github.com/SigNoz/signoz/pkg/authz"
	"github.com/SigNoz/signoz/pkg/factory"
	"github.com/SigNoz/signoz/pkg/gateway"
	"github.com/SigNoz/signoz/pkg/licensing"
	"github.com/SigNoz/signoz/pkg/modules/dashboard"
	pkgimpldashboard "github.com/SigNoz/signoz/pkg/modules/dashboard/impldashboard"
@@ -120,6 +122,9 @@ func runServer(ctx context.Context, config signoz.Config, logger *slog.Logger) e
		func(store sqlstore.SQLStore, settings factory.ProviderSettings, analytics analytics.Analytics, orgGetter organization.Getter, role role.Module, queryParser queryparser.QueryParser, querier querier.Querier, licensing licensing.Licensing) dashboard.Module {
			return impldashboard.NewModule(pkgimpldashboard.NewStore(store), settings, analytics, orgGetter, role, queryParser, querier, licensing)
		},
		func(licensing licensing.Licensing) factory.ProviderFactory[gateway.Gateway, gateway.Config] {
			return httpgateway.NewProviderFactory(licensing)
		},
	)
	if err != nil {
		logger.ErrorContext(ctx, "failed to create signoz", "error", err)
@@ -176,7 +176,7 @@ services:
    # - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
  signoz:
    !!merge <<: *db-depend
    image: signoz/signoz:v0.107.0
    image: signoz/signoz:v0.108.0
    command:
      - --config=/root/config/prometheus.yml
    ports:

@@ -117,7 +117,7 @@ services:
    # - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
  signoz:
    !!merge <<: *db-depend
    image: signoz/signoz:v0.107.0
    image: signoz/signoz:v0.108.0
    command:
      - --config=/root/config/prometheus.yml
    ports:

@@ -179,7 +179,7 @@ services:
    # - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
  signoz:
    !!merge <<: *db-depend
    image: signoz/signoz:${VERSION:-v0.107.0}
    image: signoz/signoz:${VERSION:-v0.108.0}
    container_name: signoz
    command:
      - --config=/root/config/prometheus.yml

@@ -111,7 +111,7 @@ services:
    # - ../common/clickhouse/storage.xml:/etc/clickhouse-server/config.d/storage.xml
  signoz:
    !!merge <<: *db-depend
    image: signoz/signoz:${VERSION:-v0.107.0}
    image: signoz/signoz:${VERSION:-v0.108.0}
    container_name: signoz
    command:
      - --config=/root/config/prometheus.yml
@@ -607,186 +607,6 @@ paths:
|
||||
summary: Update auth domain
|
||||
tags:
|
||||
- authdomains
|
||||
/api/v1/fields/keys:
|
||||
get:
|
||||
deprecated: false
|
||||
description: This endpoint returns field keys
|
||||
operationId: GetFieldsKeys
|
||||
parameters:
|
||||
- in: query
|
||||
name: signal
|
||||
schema:
|
||||
type: string
|
||||
- in: query
|
||||
name: source
|
||||
schema:
|
||||
type: string
|
||||
- in: query
|
||||
name: limit
|
||||
schema:
|
||||
type: integer
|
||||
- in: query
|
||||
name: startUnixMilli
|
||||
schema:
|
||||
format: int64
|
||||
type: integer
|
||||
- in: query
|
||||
name: endUnixMilli
|
||||
schema:
|
||||
format: int64
|
||||
type: integer
|
||||
- in: query
|
||||
name: fieldContext
|
||||
schema:
|
||||
type: string
|
||||
- in: query
|
||||
name: fieldDataType
|
||||
schema:
|
||||
type: string
|
||||
- content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/TelemetrytypesMetricContext'
|
||||
in: query
|
||||
name: metricContext
|
||||
- in: query
|
||||
name: name
|
||||
schema:
|
||||
type: string
|
||||
- in: query
|
||||
name: searchText
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
properties:
|
||||
data:
|
||||
$ref: '#/components/schemas/TelemetrytypesGettableFieldKeys'
|
||||
status:
|
||||
type: string
|
||||
type: object
|
||||
description: OK
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- VIEWER
|
||||
- tokenizer:
|
||||
- VIEWER
|
||||
summary: Get field keys
|
||||
tags:
|
||||
- fields
|
||||
/api/v1/fields/values:
|
||||
get:
|
||||
deprecated: false
|
||||
description: This endpoint returns field values
|
||||
operationId: GetFieldsValues
|
||||
parameters:
|
||||
- in: query
|
||||
name: signal
|
||||
schema:
|
||||
type: string
|
||||
- in: query
|
||||
name: source
|
||||
schema:
|
||||
type: string
|
||||
- in: query
|
||||
name: limit
|
||||
schema:
|
||||
type: integer
|
||||
- in: query
|
||||
name: startUnixMilli
|
||||
schema:
|
||||
format: int64
|
||||
type: integer
|
||||
- in: query
|
||||
name: endUnixMilli
|
||||
schema:
|
||||
format: int64
|
||||
type: integer
|
||||
- in: query
|
||||
name: fieldContext
|
||||
schema:
|
||||
type: string
|
||||
- in: query
|
||||
name: fieldDataType
|
||||
schema:
|
||||
type: string
|
||||
- content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/TelemetrytypesMetricContext'
|
||||
in: query
|
||||
name: metricContext
|
||||
- in: query
|
||||
name: name
|
||||
schema:
|
||||
type: string
|
||||
- in: query
|
||||
name: searchText
|
||||
schema:
|
||||
type: string
|
||||
- in: query
|
||||
name: existingQuery
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
properties:
|
||||
data:
|
||||
$ref: '#/components/schemas/TelemetrytypesGettableFieldValues'
|
||||
status:
|
||||
type: string
|
||||
type: object
|
||||
description: OK
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- VIEWER
|
||||
- tokenizer:
|
||||
- VIEWER
|
||||
summary: Get field values
|
||||
tags:
|
||||
- fields
|
||||
/api/v1/getResetPasswordToken/{id}:
|
||||
get:
|
||||
deprecated: false
|
||||
@@ -1187,43 +1007,6 @@ paths:
|
||||
summary: Create bulk invite
|
||||
tags:
|
||||
- users
|
||||
/api/v1/login:
|
||||
post:
|
||||
deprecated: true
|
||||
description: This endpoint is deprecated and will be removed in the future
|
||||
operationId: DeprecatedCreateSessionByEmailPassword
|
||||
requestBody:
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/AuthtypesDeprecatedPostableLogin'
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
properties:
|
||||
data:
|
||||
$ref: '#/components/schemas/AuthtypesDeprecatedGettableLogin'
|
||||
status:
|
||||
type: string
|
||||
type: object
|
||||
description: OK
|
||||
"400":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Bad Request
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
summary: Deprecated create session by email password
|
||||
tags:
|
||||
- sessions
|
||||
/api/v1/logs/promote_paths:
|
||||
get:
|
||||
deprecated: false
|
||||
@@ -2247,11 +2030,371 @@ paths:
|
||||
summary: Get features
|
||||
tags:
|
||||
- features
|
||||
/api/v2/gateway/ingestion_keys:
|
||||
get:
|
||||
deprecated: false
|
||||
description: This endpoint returns the ingestion keys for a workspace
|
||||
operationId: GetIngestionKeys
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
properties:
|
||||
data:
|
||||
$ref: '#/components/schemas/GatewaytypesGettableIngestionKeys'
|
||||
status:
|
||||
type: string
|
||||
type: object
|
||||
description: OK
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- ADMIN
|
||||
- tokenizer:
|
||||
- ADMIN
|
||||
summary: Get ingestion keys for workspace
|
||||
tags:
|
||||
- gateway
|
||||
post:
|
||||
deprecated: false
|
||||
description: This endpoint creates an ingestion key for the workspace
|
||||
operationId: CreateIngestionKey
|
||||
requestBody:
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/GatewaytypesPostableIngestionKey'
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
properties:
|
||||
data:
|
||||
$ref: '#/components/schemas/GatewaytypesGettableCreatedIngestionKey'
|
||||
status:
|
||||
type: string
|
||||
type: object
|
||||
description: OK
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- ADMIN
|
||||
- tokenizer:
|
||||
- ADMIN
|
||||
summary: Create ingestion key for workspace
|
||||
tags:
|
||||
- gateway
|
||||
/api/v2/gateway/ingestion_keys/{keyId}:
|
||||
delete:
|
||||
deprecated: false
|
||||
description: This endpoint deletes an ingestion key for the workspace
|
||||
operationId: DeleteIngestionKey
|
||||
parameters:
|
||||
- in: path
|
||||
name: keyId
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"204":
|
||||
description: No Content
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- ADMIN
|
||||
- tokenizer:
|
||||
- ADMIN
|
||||
summary: Delete ingestion key for workspace
|
||||
tags:
|
||||
- gateway
|
||||
patch:
|
||||
deprecated: false
|
||||
description: This endpoint updates an ingestion key for the workspace
|
||||
operationId: UpdateIngestionKey
|
||||
parameters:
|
||||
- in: path
|
||||
name: keyId
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
requestBody:
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/GatewaytypesPostableIngestionKey'
|
||||
responses:
|
||||
"204":
|
||||
description: No Content
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- ADMIN
|
||||
- tokenizer:
|
||||
- ADMIN
|
||||
summary: Update ingestion key for workspace
|
||||
tags:
|
||||
- gateway
|
||||
/api/v2/gateway/ingestion_keys/{keyId}/limits:
|
||||
post:
|
||||
deprecated: false
|
||||
description: This endpoint creates an ingestion key limit
|
||||
operationId: CreateIngestionKeyLimit
|
||||
parameters:
|
||||
- in: path
|
||||
name: keyId
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
requestBody:
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/GatewaytypesPostableIngestionKeyLimit'
|
||||
responses:
|
||||
"201":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
properties:
|
||||
data:
|
||||
$ref: '#/components/schemas/GatewaytypesGettableCreatedIngestionKeyLimit'
|
||||
status:
|
||||
type: string
|
||||
type: object
|
||||
description: Created
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- ADMIN
|
||||
- tokenizer:
|
||||
- ADMIN
|
||||
summary: Create limit for the ingestion key
|
||||
tags:
|
||||
- gateway
|
||||
/api/v2/gateway/ingestion_keys/limits/{limitId}:
|
||||
delete:
|
||||
deprecated: false
|
||||
description: This endpoint deletes an ingestion key limit
|
||||
operationId: DeleteIngestionKeyLimit
|
||||
parameters:
|
||||
- in: path
|
||||
name: limitId
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"204":
|
||||
description: No Content
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- ADMIN
|
||||
- tokenizer:
|
||||
- ADMIN
|
||||
summary: Delete limit for the ingestion key
|
||||
tags:
|
||||
- gateway
|
||||
patch:
|
||||
deprecated: false
|
||||
description: This endpoint updates an ingestion key limit
|
||||
operationId: UpdateIngestionKeyLimit
|
||||
parameters:
|
||||
- in: path
|
||||
name: limitId
|
||||
required: true
|
||||
schema:
|
||||
type: string
|
||||
requestBody:
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/GatewaytypesUpdatableIngestionKeyLimit'
|
||||
responses:
|
||||
"204":
|
||||
description: No Content
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- ADMIN
|
||||
- tokenizer:
|
||||
- ADMIN
|
||||
summary: Update limit for the ingestion key
|
||||
tags:
|
||||
- gateway
|
||||
/api/v2/gateway/ingestion_keys/search:
|
||||
get:
|
||||
deprecated: false
|
||||
description: This endpoint returns the ingestion keys for a workspace
|
||||
operationId: SearchIngestionKeys
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
properties:
|
||||
data:
|
||||
$ref: '#/components/schemas/GatewaytypesGettableIngestionKeys'
|
||||
status:
|
||||
type: string
|
||||
type: object
|
||||
description: OK
|
||||
"401":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Unauthorized
|
||||
"403":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Forbidden
|
||||
"500":
|
||||
content:
|
||||
application/json:
|
||||
schema:
|
||||
$ref: '#/components/schemas/RenderErrorResponse'
|
||||
description: Internal Server Error
|
||||
security:
|
||||
- api_key:
|
||||
- ADMIN
|
||||
- tokenizer:
|
||||
- ADMIN
|
||||
summary: Search ingestion keys for workspace
|
||||
tags:
|
||||
- gateway
|
||||
/api/v2/metric/alerts:
|
||||
get:
|
||||
deprecated: false
|
||||
description: This endpoint returns associated alerts for a specified metric
|
||||
operationId: GetMetricAlerts
|
||||
parameters:
|
||||
- in: query
|
||||
name: metricName
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
@@ -2301,6 +2444,11 @@ paths:
|
||||
deprecated: false
|
||||
description: This endpoint returns associated dashboards for a specified metric
|
||||
operationId: GetMetricDashboards
|
||||
parameters:
|
||||
- in: query
|
||||
name: metricName
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
@@ -2351,6 +2499,11 @@ paths:
|
||||
description: This endpoint returns highlights like number of datapoints, totaltimeseries,
|
||||
active time series, last received time for a specified metric
|
||||
operationId: GetMetricHighlights
|
||||
parameters:
|
||||
- in: query
|
||||
name: metricName
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
@@ -2512,6 +2665,11 @@ paths:
|
||||
description: This endpoint returns metadata information like metric description,
|
||||
unit, type, temporality, monotonicity for a specified metric
|
||||
operationId: GetMetricMetadata
|
||||
parameters:
|
||||
- in: query
|
||||
name: metricName
|
||||
schema:
|
||||
type: string
|
||||
responses:
|
||||
"200":
|
||||
content:
|
||||
@@ -2968,20 +3126,6 @@ components:
|
||||
url:
|
||||
type: string
|
||||
type: object
|
||||
AuthtypesDeprecatedGettableLogin:
|
||||
properties:
|
||||
accessJwt:
|
||||
type: string
|
||||
userId:
|
||||
type: string
|
||||
type: object
|
||||
AuthtypesDeprecatedPostableLogin:
|
||||
properties:
|
||||
email:
|
||||
type: string
|
||||
password:
|
||||
type: string
|
||||
type: object
|
||||
AuthtypesGettableAuthDomain:
|
||||
properties:
|
||||
authNProviderInfo:
|
||||
@@ -3231,6 +3375,160 @@ components:
|
||||
nullable: true
|
||||
type: object
|
||||
type: object
|
||||
GatewaytypesGettableCreatedIngestionKey:
|
||||
properties:
|
||||
id:
|
||||
type: string
|
||||
value:
|
||||
type: string
|
||||
type: object
|
||||
GatewaytypesGettableCreatedIngestionKeyLimit:
|
||||
properties:
|
||||
id:
|
||||
type: string
|
||||
type: object
|
||||
GatewaytypesGettableIngestionKeys:
|
||||
properties:
|
||||
_pagination:
|
||||
$ref: '#/components/schemas/GatewaytypesPagination'
|
||||
keys:
|
||||
items:
|
||||
$ref: '#/components/schemas/GatewaytypesIngestionKey'
|
||||
nullable: true
|
||||
type: array
|
||||
type: object
|
||||
GatewaytypesIngestionKey:
|
||||
properties:
|
||||
created_at:
|
||||
format: date-time
|
||||
type: string
|
||||
expires_at:
|
||||
format: date-time
|
||||
type: string
|
||||
id:
|
||||
type: string
|
||||
limits:
|
||||
items:
|
||||
$ref: '#/components/schemas/GatewaytypesLimit'
|
||||
nullable: true
|
||||
type: array
|
||||
name:
|
||||
type: string
|
||||
tags:
|
||||
items:
|
||||
type: string
|
||||
nullable: true
|
||||
type: array
|
||||
updated_at:
|
||||
format: date-time
|
||||
type: string
|
||||
value:
|
||||
type: string
|
||||
workspace_id:
|
||||
type: string
|
||||
type: object
|
||||
GatewaytypesLimit:
|
||||
properties:
|
||||
config:
|
||||
$ref: '#/components/schemas/GatewaytypesLimitConfig'
|
||||
created_at:
|
||||
format: date-time
|
||||
type: string
|
||||
id:
|
||||
type: string
|
||||
key_id:
|
||||
type: string
|
||||
metric:
|
||||
$ref: '#/components/schemas/GatewaytypesLimitMetric'
|
||||
signal:
|
||||
type: string
|
||||
tags:
|
||||
items:
|
||||
type: string
|
||||
nullable: true
|
||||
type: array
|
||||
updated_at:
|
||||
format: date-time
|
||||
type: string
|
||||
type: object
|
||||
GatewaytypesLimitConfig:
|
||||
properties:
|
||||
day:
|
||||
$ref: '#/components/schemas/GatewaytypesLimitValue'
|
||||
second:
|
||||
$ref: '#/components/schemas/GatewaytypesLimitValue'
|
||||
type: object
|
||||
GatewaytypesLimitMetric:
|
||||
properties:
|
||||
day:
|
||||
$ref: '#/components/schemas/GatewaytypesLimitMetricValue'
|
||||
second:
|
||||
$ref: '#/components/schemas/GatewaytypesLimitMetricValue'
|
||||
type: object
|
||||
GatewaytypesLimitMetricValue:
|
||||
properties:
|
||||
count:
|
||||
format: int64
|
||||
type: integer
|
||||
size:
|
||||
format: int64
|
||||
type: integer
|
||||
type: object
|
||||
GatewaytypesLimitValue:
|
||||
properties:
|
||||
count:
|
||||
format: int64
|
||||
type: integer
|
||||
size:
|
||||
format: int64
|
||||
type: integer
|
||||
type: object
|
||||
GatewaytypesPagination:
|
||||
properties:
|
||||
page:
|
||||
type: integer
|
||||
pages:
|
||||
type: integer
|
||||
per_page:
|
||||
type: integer
|
||||
total:
|
||||
type: integer
|
||||
type: object
|
||||
GatewaytypesPostableIngestionKey:
|
||||
properties:
|
||||
expires_at:
|
||||
format: date-time
|
||||
type: string
|
||||
name:
|
||||
type: string
|
||||
tags:
|
||||
items:
|
||||
type: string
|
||||
nullable: true
|
||||
type: array
|
||||
type: object
|
||||
GatewaytypesPostableIngestionKeyLimit:
|
||||
properties:
|
||||
config:
|
||||
$ref: '#/components/schemas/GatewaytypesLimitConfig'
|
||||
signal:
|
||||
type: string
|
||||
tags:
|
||||
items:
|
||||
type: string
|
||||
nullable: true
|
||||
type: array
|
||||
type: object
|
||||
GatewaytypesUpdatableIngestionKeyLimit:
|
||||
properties:
|
||||
config:
|
||||
$ref: '#/components/schemas/GatewaytypesLimitConfig'
|
||||
tags:
|
||||
items:
|
||||
type: string
|
||||
nullable: true
|
||||
type: array
|
||||
type: object
|
||||
MetricsexplorertypesMetricAlert:
|
||||
properties:
|
||||
alertId:
|
||||
@@ -3561,65 +3859,6 @@ components:
|
||||
status:
|
||||
type: string
|
||||
type: object
|
||||
TelemetrytypesGettableFieldKeys:
|
||||
properties:
|
||||
complete:
|
||||
type: boolean
|
||||
keys:
|
||||
additionalProperties:
|
||||
items:
|
||||
$ref: '#/components/schemas/TelemetrytypesTelemetryFieldKey'
|
||||
type: array
|
||||
nullable: true
|
||||
type: object
|
||||
type: object
|
||||
TelemetrytypesGettableFieldValues:
|
||||
properties:
|
||||
complete:
|
||||
type: boolean
|
||||
values:
|
||||
$ref: '#/components/schemas/TelemetrytypesTelemetryFieldValues'
|
||||
type: object
|
||||
TelemetrytypesMetricContext:
|
||||
properties:
|
||||
metricName:
|
||||
type: string
|
||||
type: object
|
||||
TelemetrytypesTelemetryFieldKey:
|
||||
properties:
|
||||
description:
|
||||
type: string
|
||||
fieldContext:
|
||||
type: string
|
||||
fieldDataType:
|
||||
type: string
|
||||
name:
|
||||
type: string
|
||||
signal:
|
||||
type: string
|
||||
unit:
|
||||
type: string
|
||||
type: object
|
||||
TelemetrytypesTelemetryFieldValues:
|
||||
properties:
|
||||
boolValues:
|
||||
items:
|
||||
type: boolean
|
||||
type: array
|
||||
numberValues:
|
||||
items:
|
||||
format: double
|
||||
type: number
|
||||
type: array
|
||||
relatedValues:
|
||||
items:
|
||||
type: string
|
||||
type: array
|
||||
stringValues:
|
||||
items:
|
||||
type: string
|
||||
type: array
|
||||
type: object
|
||||
TypesChangePasswordRequest:
|
||||
properties:
|
||||
newPassword:
|
||||
|
||||
docs/implementation/EXTERNAL_API_MONITORING.md (new file, 292 lines)
@@ -0,0 +1,292 @@

# External API Monitoring - Developer Guide

## Overview

External API Monitoring tracks outbound HTTP calls from your services to external APIs. It groups spans by domain (e.g., `api.example.com`) and displays metrics like endpoint count, request rate, error rate, latency, and last seen time.

**Key Requirement**: Spans must have `kind_string = 'Client'`, at least one of `http.url`/`url.full`, and at least one of `net.peer.name`/`server.address`.

---

## Architecture Flow

```
Frontend (DomainList)
  → useListOverview hook
  → POST /api/v1/third-party-apis/overview/list
  → getDomainList handler
  → BuildDomainList (7 queries)
  → QueryRange (ClickHouse)
  → Post-processing (merge semconv, filter IPs)
  → formatDataForTable
  → UI Display
```

---

## Key APIs

### 1. Domain List API

**Endpoint**: `POST /api/v1/third-party-apis/overview/list`

**Request**:
```json
{
  "start": 1699123456789,  // Unix timestamp (ms)
  "end": 1699127056789,
  "show_ip": false,        // Filter IP addresses
  "filter": {
    "expression": "kind_string = 'Client' AND service.name = 'api'"
  }
}
```

**Response**: Table with columns:
- `net.peer.name` (domain name)
- `endpoints` (count_distinct with fallback: http.url or url.full)
- `rps` (rate())
- `error_rate` (formula: error/total_span * 100)
- `p99` (p99(duration_nano))
- `lastseen` (max(timestamp))

**Handler**: `pkg/query-service/app/http_handler.go::getDomainList()`

---

### 2. Domain Info API

**Endpoint**: `POST /api/v1/third-party-apis/overview/domain`

**Request**: Same as Domain List, but includes a `domain` field

**Response**: Endpoint-level metrics for a specific domain

**Handler**: `pkg/query-service/app/http_handler.go::getDomainInfo()`

---

## Query Building

### Location

`pkg/modules/thirdpartyapi/translator.go`

### BuildDomainList() - Creates 7 Sub-queries

1. **endpoints**: `count_distinct(if(http.url exists, http.url, url.full))` - Unique endpoint count (handles both semconv attributes)
2. **lastseen**: `max(timestamp)` - Last access time
3. **rps**: `rate()` - Requests per second
4. **error**: `count() WHERE has_error = true` - Error count
5. **total_span**: `count()` - Total spans (for error rate)
6. **p99**: `p99(duration_nano)` - 99th percentile latency
7. **error_rate**: Formula `(error/total_span)*100`

### Base Filter

```go
"(http.url EXISTS OR url.full EXISTS) AND kind_string = 'Client'"
```

### GroupBy

- Groups by `server.address` + `net.peer.name` (dual semconv support)

---

## Key Files

### Frontend

| File | Purpose |
|------|---------|
| `frontend/src/container/ApiMonitoring/Explorer/Domains/DomainList.tsx` | Main list view component |
| `frontend/src/container/ApiMonitoring/Explorer/Domains/DomainDetails/DomainDetails.tsx` | Domain details drawer |
| `frontend/src/hooks/thirdPartyApis/useListOverview.ts` | Data fetching hook |
| `frontend/src/api/thirdPartyApis/listOverview.ts` | API client |
| `frontend/src/container/ApiMonitoring/utils.tsx` | Utilities (formatting, query building) |

### Backend

| File | Purpose |
|------|---------|
| `pkg/query-service/app/http_handler.go` | API handlers (`getDomainList`, `getDomainInfo`) |
| `pkg/modules/thirdpartyapi/translator.go` | Query builder & response processing |
| `pkg/types/thirdpartyapitypes/thirdpartyapi.go` | Request/response types |

---

## Data Tables

### Primary Table

- **Table**: `signoz_traces.distributed_signoz_index_v3`
- **Key Columns**:
  - `kind_string` - Filter for `'Client'` spans
  - `duration_nano` - For latency calculations
  - `has_error` - For error rate
  - `timestamp` - For last seen
  - `attributes_string` - Map containing `http.url`, `net.peer.name`, etc.
  - `resources_string` - Map containing `server.address`, `service.name`, etc.

### Attribute Access

```sql
-- Check existence
mapContains(attributes_string, 'http.url') = 1

-- Get value
attributes_string['http.url']

-- Materialized (if exists)
attribute_string_http$$url
```

---

## Post-Processing

### 1. MergeSemconvColumns()

- Merges `server.address` and `net.peer.name` into a single column
- Location: `pkg/modules/thirdpartyapi/translator.go:117`

### 2. FilterIntermediateColumns()

- Removes intermediate formula columns from the response
- Location: `pkg/modules/thirdpartyapi/translator.go:70`

### 3. FilterResponse()

- Filters out IP addresses if `show_ip = false` (see the sketch below)
- Uses `net.ParseIP()` to detect IPs
- Location: `pkg/modules/thirdpartyapi/translator.go:214`
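A minimal sketch of the IP-filtering idea behind `FilterResponse()`, assuming the domain values arrive as plain strings; this illustrates the `net.ParseIP` check (standard library `net` package) and is not the actual implementation in `translator.go`, which operates on the query-range response structure:

```go
// Illustrative only: drops values that parse as raw IP addresses when
// showIP is false; hostnames are kept.
func filterIPDomains(domains []string, showIP bool) []string {
	if showIP {
		return domains
	}
	kept := make([]string, 0, len(domains))
	for _, d := range domains {
		if net.ParseIP(d) != nil {
			continue // skip raw IP addresses
		}
		kept = append(kept, d)
	}
	return kept
}
```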
---

## Required Attributes

### For Domain Grouping

- `net.peer.name` OR `server.address` (required)

### For Filtering

- `http.url` OR `url.full` (required)
- `kind_string = 'Client'` (required)

### Not Required

- `http.target` - Not used in external API monitoring

### Known Bug

`buildEndpointsQuery()` uses `count_distinct(http.url)` but the filter allows `url.full`. If spans only have `url.full`, they pass the filter but don't contribute to the endpoint count.

**Fix Needed**: Update the aggregation to handle both attributes:
```go
// Current (buggy)
{Expression: "count_distinct(http.url)"}

// Should be
{Expression: "count_distinct(coalesce(http.url, url.full))"}
```

---

## Frontend Data Flow

### 1. Domain List View

```
DomainList component
  → useListOverview({ start, end, show_ip, filter })
  → listOverview API call
  → formatDataForTable(response)
  → Table display
```

### 2. Domain Details View

```
User clicks domain
  → DomainDetails drawer opens
  → Multiple queries:
    - DomainMetrics (overview cards)
    - AllEndpoints (endpoint table)
    - TopErrors (error table)
    - EndPointDetails (when endpoint selected)
```

### 3. Data Formatting

- `formatDataForTable()` - Converts the API response to table format
- Handles `n/a` values, converts nanoseconds to milliseconds
- Maps column names to display fields

---

## Query Examples

### Domain List Query

```sql
SELECT
  multiIf(
    mapContains(attributes_string, 'server.address'),
    attributes_string['server.address'],
    mapContains(attributes_string, 'net.peer.name'),
    attributes_string['net.peer.name'],
    NULL
  ) AS domain,
  count_distinct(attributes_string['http.url']) AS endpoints,
  rate() AS rps,
  p99(duration_nano) AS p99,
  max(timestamp) AS lastseen
FROM signoz_traces.distributed_signoz_index_v3
WHERE
  (mapContains(attributes_string, 'http.url') = 1
   OR mapContains(attributes_string, 'url.full') = 1)
  AND kind_string = 'Client'
  AND timestamp >= ? AND timestamp < ?
GROUP BY domain
```

---

## Testing

### Key Test Files

- `frontend/src/container/ApiMonitoring/__tests__/AllEndpointsWidgetV5Migration.test.tsx`
- `frontend/src/container/ApiMonitoring/__tests__/EndpointDropdownV5Migration.test.tsx`
- `pkg/modules/thirdpartyapi/translator_test.go`

### Test Scenarios

1. Domain filtering with both semconv attributes
2. URL handling (http.url vs url.full)
3. IP address filtering
4. Error rate calculation
5. Empty state handling

---

## Common Issues

### Empty State

**Symptom**: No domains shown despite data existing

**Causes**:
1. Missing `net.peer.name` or `server.address`
2. Missing `http.url` or `url.full`
3. Spans not marked as `kind_string = 'Client'`
4. Bug: Only `url.full` present but the query uses `count_distinct(http.url)`

### Performance

- Queries use `ts_bucket_start` for time partitioning
- Resource filtering uses the separate `distributed_traces_v3_resource` table
- Materialized columns improve performance for common attributes

---

## Quick Start Checklist

- [ ] Understand the trace table schema (`signoz_index_v3`)
- [ ] Review `BuildDomainList()` in `translator.go`
- [ ] Check the `getDomainList()` handler in `http_handler.go`
- [ ] Review the frontend `DomainList.tsx` component
- [ ] Understand semconv attribute mapping (legacy vs current)
- [ ] Test with spans that have the required attributes
- [ ] Review post-processing functions (merge, filter)

---

## References

- **Trace Schema**: `pkg/telemetrytraces/field_mapper.go`
- **Query Builder**: `pkg/telemetrytraces/statement_builder.go`
- **API Routes**: `pkg/query-service/app/http_handler.go:2157`
- **Constants**: `pkg/modules/thirdpartyapi/translator.go:14-20`
docs/implementation/QUERY_RANGE_API.md (new file, 980 lines)
@@ -0,0 +1,980 @@

# Query Range API (V5) - Developer Guide

This document provides a comprehensive guide to the Query Range API (V5), which is the primary query endpoint for traces, logs, and metrics in SigNoz. It covers architecture, request/response models, code flows, and implementation details.

## Table of Contents

1. [Overview](#overview)
2. [API Endpoint](#api-endpoint)
3. [Request/Response Models](#requestresponse-models)
4. [Query Types](#query-types)
5. [Request Types](#request-types)
6. [Code Flow](#code-flow)
7. [Key Components](#key-components)
8. [Query Execution](#query-execution)
9. [Caching](#caching)
10. [Result Processing](#result-processing)
11. [Performance Considerations](#performance-considerations)
12. [Extending the API](#extending-the-api)

---

## Overview

The Query Range API (V5) is the unified query endpoint for all telemetry signals (traces, logs, metrics) in SigNoz. It provides:

- **Unified Interface**: Single endpoint for all signal types
- **Query Builder**: Visual query builder support
- **Multiple Query Types**: Builder queries, PromQL, ClickHouse SQL, Formulas, Trace Operators
- **Flexible Response Types**: Time series, scalar, raw data, trace-specific
- **Advanced Features**: Aggregations, filters, group by, ordering, pagination
- **Caching**: Intelligent caching for performance

### Key Technologies

- **Backend**: Go (Golang)
- **Storage**: ClickHouse (columnar database)
- **Query Language**: Custom query builder + PromQL + ClickHouse SQL
- **Protocol**: HTTP/REST API

---

## API Endpoint

### Endpoint Details

**URL**: `POST /api/v5/query_range`

**Handler**: `QuerierAPI.QueryRange` → `querier.QueryRange`

**Location**:
- Handler: `pkg/querier/querier.go:122`
- Route Registration: `pkg/query-service/app/http_handler.go:480`

**Authentication**: Requires ViewAccess permission

**Content-Type**: `application/json`

### Request Flow

```
HTTP Request (POST /api/v5/query_range)
  ↓
HTTP Handler (QuerierAPI.QueryRange)
  ↓
Querier.QueryRange (pkg/querier/querier.go)
  ↓
Query Execution (Statement Builders → ClickHouse)
  ↓
Result Processing & Merging
  ↓
HTTP Response (QueryRangeResponse)
```

---

## Request/Response Models

### Request Model

**Location**: `pkg/types/querybuildertypes/querybuildertypesv5/req.go`

```go
type QueryRangeRequest struct {
	Start          uint64                  // Start timestamp (milliseconds)
	End            uint64                  // End timestamp (milliseconds)
	RequestType    RequestType             // Response type (TimeSeries, Scalar, Raw, Trace)
	Variables      map[string]VariableItem // Template variables
	CompositeQuery CompositeQuery          // Container for queries
	NoCache        bool                    // Skip cache flag
}
```

### Composite Query

```go
type CompositeQuery struct {
	Queries []QueryEnvelope // Array of queries to execute
}
```

### Query Envelope

```go
type QueryEnvelope struct {
	Type QueryType // Query type (Builder, PromQL, ClickHouseSQL, Formula, TraceOperator)
	Spec any       // Query specification (type-specific)
}
```
### Response Model
|
||||
|
||||
**Location**: `pkg/types/querybuildertypes/querybuildertypesv5/req.go`
|
||||
|
||||
```go
|
||||
type QueryRangeResponse struct {
|
||||
Type RequestType // Response type
|
||||
Data QueryData // Query results
|
||||
Meta ExecStats // Execution statistics
|
||||
Warning *QueryWarnData // Warnings (if any)
|
||||
QBEvent *QBEvent // Query builder event metadata
|
||||
}
|
||||
|
||||
type QueryData struct {
|
||||
Results []any // Array of result objects (type depends on RequestType)
|
||||
}
|
||||
|
||||
type ExecStats struct {
|
||||
RowsScanned uint64 // Total rows scanned
|
||||
BytesScanned uint64 // Total bytes scanned
|
||||
DurationMS uint64 // Query duration in milliseconds
|
||||
StepIntervals map[string]uint64 // Step intervals per query
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Query Types
|
||||
|
||||
The API supports multiple query types, each with its own specification format.
|
||||
|
||||
### 1. Builder Query (`QueryTypeBuilder`)
|
||||
|
||||
Visual query builder queries. Supports traces, logs, and metrics.
|
||||
|
||||
**Spec Type**: `QueryBuilderQuery[T]` where T is:
|
||||
- `TraceAggregation` for traces
|
||||
- `LogAggregation` for logs
|
||||
- `MetricAggregation` for metrics
|
||||
|
||||
**Example**:
|
||||
```go
|
||||
QueryBuilderQuery[TraceAggregation] {
|
||||
Name: "query_name",
|
||||
Signal: SignalTraces,
|
||||
Filter: &Filter {
|
||||
Expression: "service.name = 'api' AND duration_nano > 1000000",
|
||||
},
|
||||
Aggregations: []TraceAggregation {
|
||||
{Expression: "count()", Alias: "total"},
|
||||
{Expression: "avg(duration_nano)", Alias: "avg_duration"},
|
||||
},
|
||||
GroupBy: []GroupByKey {...},
|
||||
Order: []OrderBy {...},
|
||||
Limit: 100,
|
||||
}
|
||||
```
|
||||
|
||||
**Key Files**:
|
||||
- Traces: `pkg/telemetrytraces/statement_builder.go`
|
||||
- Logs: `pkg/telemetrylogs/statement_builder.go`
|
||||
- Metrics: `pkg/telemetrymetrics/statement_builder.go`
|
||||
|
||||
### 2. PromQL Query (`QueryTypePromQL`)
|
||||
|
||||
Prometheus Query Language queries for metrics.
|
||||
|
||||
**Spec Type**: `PromQuery`
|
||||
|
||||
**Example**:
|
||||
```go
|
||||
PromQuery {
|
||||
Query: "rate(http_requests_total[5m])",
|
||||
Step: Step{Duration: time.Minute},
|
||||
}
|
||||
```
|
||||
|
||||
**Key Files**: `pkg/querier/promql_query.go`
|
||||
|
||||
### 3. ClickHouse SQL Query (`QueryTypeClickHouseSQL`)
|
||||
|
||||
Direct ClickHouse SQL queries.
|
||||
|
||||
**Spec Type**: `ClickHouseQuery`
|
||||
|
||||
**Example**:
|
||||
```go
|
||||
ClickHouseQuery {
|
||||
Query: "SELECT count() FROM signoz_traces.distributed_signoz_index_v3 WHERE ...",
|
||||
}
|
||||
```
|
||||
|
||||
**Key Files**: `pkg/querier/ch_sql_query.go`
|
||||
|
||||
### 4. Formula Query (`QueryTypeFormula`)
|
||||
|
||||
Mathematical formulas combining other queries.
|
||||
|
||||
**Spec Type**: `QueryBuilderFormula`
|
||||
|
||||
**Example**:
|
||||
```go
|
||||
QueryBuilderFormula {
|
||||
Expression: "A / B * 100", // A and B are query names
|
||||
}
|
||||
```
|
||||
|
||||
**Key Files**: `pkg/querier/formula_query.go`
|
||||
|
||||
### 5. Trace Operator Query (`QueryTypeTraceOperator`)
|
||||
|
||||
Set operations on trace queries (AND, OR, NOT).
|
||||
|
||||
**Spec Type**: `QueryBuilderTraceOperator`
|
||||
|
||||
**Example**:
|
||||
```go
|
||||
QueryBuilderTraceOperator {
|
||||
Expression: "A AND B", // A and B are query names
|
||||
Filter: &Filter {...},
|
||||
}
|
||||
```
|
||||
|
||||
**Key Files**:
|
||||
- `pkg/telemetrytraces/trace_operator_statement_builder.go`
|
||||
- `pkg/querier/trace_operator_query.go`
|
||||
|
||||
---
|
||||
|
||||
## Request Types
|
||||
|
||||
The `RequestType` determines the format of the response data.
|
||||
|
||||
### 1. `RequestTypeTimeSeries`
|
||||
|
||||
Returns time series data for charts.
|
||||
|
||||
**Response Format**: `TimeSeriesData`
|
||||
|
||||
```go
|
||||
type TimeSeriesData struct {
|
||||
QueryName string
|
||||
Aggregations []AggregationBucket
|
||||
}
|
||||
|
||||
type AggregationBucket struct {
|
||||
Index int
|
||||
Series []TimeSeries
|
||||
Alias string
|
||||
Meta AggregationMeta
|
||||
}
|
||||
|
||||
type TimeSeries struct {
|
||||
Labels map[string]string
|
||||
Values []TimeSeriesValue
|
||||
}
|
||||
|
||||
type TimeSeriesValue struct {
|
||||
Timestamp int64
|
||||
Value float64
|
||||
}
|
||||
```
|
||||
|
||||
**Use Case**: Line charts, bar charts, area charts
|
||||
|
||||
### 2. `RequestTypeScalar`
|
||||
|
||||
Returns scalar values (typically a single value per query or group).
|
||||
|
||||
**Response Format**: `ScalarData`
|
||||
|
||||
```go
|
||||
type ScalarData struct {
|
||||
QueryName string
|
||||
Data []ScalarValue
|
||||
}
|
||||
|
||||
type ScalarValue struct {
|
||||
Timestamp int64
|
||||
Value float64
|
||||
}
|
||||
```
|
||||
|
||||
**Use Case**: Single value displays, stat panels
|
||||
|
||||
### 3. `RequestTypeRaw`
|
||||
|
||||
Returns raw data rows.
|
||||
|
||||
**Response Format**: `RawData`
|
||||
|
||||
```go
|
||||
type RawData struct {
|
||||
QueryName string
|
||||
Columns []string
|
||||
Rows []RawDataRow
|
||||
}
|
||||
|
||||
type RawDataRow struct {
|
||||
Timestamp time.Time
|
||||
Data map[string]any
|
||||
}
|
||||
```
|
||||
|
||||
**Use Case**: Tables, logs viewer, trace lists
|
||||
|
||||
### 4. `RequestTypeTrace`
|
||||
|
||||
Returns a trace-specific data structure.
|
||||
|
||||
**Response Format**: Trace-specific format (see traces documentation)
|
||||
|
||||
**Use Case**: Trace-specific visualizations
|
||||
|
||||
---
|
||||
|
||||
## Code Flow
|
||||
|
||||
### Complete Request Flow
|
||||
|
||||
```
|
||||
1. HTTP Request
|
||||
POST /api/v5/query_range
|
||||
Body: QueryRangeRequest JSON
|
||||
↓
|
||||
2. HTTP Handler
|
||||
QuerierAPI.QueryRange (pkg/querier/querier.go)
|
||||
- Validates request
|
||||
- Extracts organization ID from auth context
|
||||
↓
|
||||
3. Querier.QueryRange (pkg/querier/querier.go:122)
|
||||
- Validates QueryRangeRequest
|
||||
- Processes each query in CompositeQuery.Queries
|
||||
- Identifies dependencies (e.g., trace operators, formulas)
|
||||
- Calculates step intervals
|
||||
- Fetches metric temporality if needed
|
||||
↓
|
||||
4. Query Creation
|
||||
For each QueryEnvelope:
|
||||
|
||||
a. Builder Query:
|
||||
- newBuilderQuery() creates builderQuery instance
|
||||
- Selects appropriate statement builder based on signal:
|
||||
* Traces → traceStmtBuilder
|
||||
* Logs → logStmtBuilder
|
||||
* Metrics → metricStmtBuilder or meterStmtBuilder
|
||||
↓
|
||||
|
||||
b. PromQL Query:
|
||||
- newPromqlQuery() creates promqlQuery instance
|
||||
- Uses Prometheus engine
|
||||
↓
|
||||
|
||||
c. ClickHouse SQL Query:
|
||||
- newchSQLQuery() creates chSQLQuery instance
|
||||
- Direct SQL execution
|
||||
↓
|
||||
|
||||
d. Formula Query:
|
||||
- newFormulaQuery() creates formulaQuery instance
|
||||
- References other queries by name
|
||||
↓
|
||||
|
||||
e. Trace Operator Query:
|
||||
- newTraceOperatorQuery() creates traceOperatorQuery instance
|
||||
- Uses traceOperatorStmtBuilder
|
||||
↓
|
||||
5. Statement Building (for Builder queries)
|
||||
StatementBuilder.Build()
|
||||
- Resolves field keys from metadata store
|
||||
- Builds SQL based on request type:
|
||||
* RequestTypeRaw → buildListQuery()
|
||||
* RequestTypeTimeSeries → buildTimeSeriesQuery()
|
||||
* RequestTypeScalar → buildScalarQuery()
|
||||
* RequestTypeTrace → buildTraceQuery()
|
||||
- Returns SQL statement with arguments
|
||||
↓
|
||||
6. Query Execution
|
||||
Query.Execute()
|
||||
- Executes SQL/query against ClickHouse or Prometheus
|
||||
- Processes results into response format
|
||||
- Returns Result with data and statistics
|
||||
↓
|
||||
7. Caching (if applicable)
|
||||
- Checks bucket cache for time series queries
|
||||
- Executes queries for missing time ranges
|
||||
- Merges cached and fresh results
|
||||
↓
|
||||
8. Result Processing
|
||||
querier.run()
|
||||
- Executes all queries (with dependency resolution)
|
||||
- Collects results and warnings
|
||||
- Merges results from multiple queries
|
||||
↓
|
||||
9. Post-Processing
|
||||
postProcessResults()
|
||||
- Applies formulas if present
|
||||
- Handles variable substitution
|
||||
- Formats results for response
|
||||
↓
|
||||
10. HTTP Response
|
||||
- Returns QueryRangeResponse with results
|
||||
- Includes execution statistics
|
||||
- Includes warnings if any
|
||||
```
|
||||
|
||||
### Key Decision Points
|
||||
|
||||
1. **Query Type Selection**: Based on `QueryEnvelope.Type`
|
||||
2. **Signal Selection**: For builder queries, based on `Signal` field
|
||||
3. **Request Type Handling**: Different SQL generation for different request types
|
||||
4. **Caching Strategy**: Only for time series queries with valid fingerprints
|
||||
5. **Dependency Resolution**: Trace operators and formulas resolve dependencies first
|
||||
|
||||
---
|
||||
|
||||
## Key Components
|
||||
|
||||
### 1. Querier
|
||||
|
||||
**Location**: `pkg/querier/querier.go`
|
||||
|
||||
**Purpose**: Orchestrates query execution, caching, and result merging
|
||||
|
||||
**Key Methods**:
|
||||
- `QueryRange()`: Main entry point for query execution
|
||||
- `run()`: Executes queries and merges results
|
||||
- `executeWithCache()`: Handles caching logic
|
||||
- `mergeResults()`: Merges cached and fresh results
|
||||
- `postProcessResults()`: Applies formulas and variable substitution
|
||||
|
||||
**Key Features**:
|
||||
- Query orchestration across multiple query types
|
||||
- Intelligent caching with bucket-based strategy
|
||||
- Result merging from multiple queries
|
||||
- Formula evaluation
|
||||
- Time range optimization
|
||||
- Step interval calculation and validation
|
||||
|
||||
### 2. Statement Builder Interface
|
||||
|
||||
**Location**: `pkg/types/querybuildertypes/querybuildertypesv5/`
|
||||
|
||||
**Purpose**: Converts query builder specifications into executable queries
|
||||
|
||||
**Interface**:
|
||||
```go
|
||||
type StatementBuilder[T any] interface {
|
||||
Build(
|
||||
ctx context.Context,
|
||||
start uint64,
|
||||
end uint64,
|
||||
requestType RequestType,
|
||||
query QueryBuilderQuery[T],
|
||||
variables map[string]VariableItem,
|
||||
) (*Statement, error)
|
||||
}
|
||||
```
|
||||
|
||||
**Implementations**:
|
||||
- `traceQueryStatementBuilder` - Traces (`pkg/telemetrytraces/statement_builder.go`)
|
||||
- `logQueryStatementBuilder` - Logs (`pkg/telemetrylogs/statement_builder.go`)
|
||||
- `metricQueryStatementBuilder` - Metrics (`pkg/telemetrymetrics/statement_builder.go`)
|
||||
|
||||
**Key Features**:
|
||||
- Field resolution via metadata store
|
||||
- SQL generation for different request types
|
||||
- Filter, aggregation, group by, ordering support
|
||||
- Time range optimization
|
||||
|
||||
### 3. Query Interface
|
||||
|
||||
**Location**: `pkg/types/querybuildertypes/querybuildertypesv5/`
|
||||
|
||||
**Purpose**: Represents an executable query
|
||||
|
||||
**Interface**:
|
||||
```go
|
||||
type Query interface {
|
||||
Execute(ctx context.Context) (*Result, error)
|
||||
Fingerprint() string // For caching
|
||||
Window() (uint64, uint64) // Time range
|
||||
}
|
||||
```
|
||||
|
||||
**Implementations**:
|
||||
- `builderQuery[T]` - Builder queries (`pkg/querier/builder_query.go`)
|
||||
- `promqlQuery` - PromQL queries (`pkg/querier/promql_query.go`)
|
||||
- `chSQLQuery` - ClickHouse SQL queries (`pkg/querier/ch_sql_query.go`)
|
||||
- `formulaQuery` - Formula queries (`pkg/querier/formula_query.go`)
|
||||
- `traceOperatorQuery` - Trace operator queries (`pkg/querier/trace_operator_query.go`)
|
||||
|
||||
### 4. Telemetry Store
|
||||
|
||||
**Location**: `pkg/telemetrystore/`
|
||||
|
||||
**Purpose**: Abstraction layer for ClickHouse database access
|
||||
|
||||
**Key Methods**:
|
||||
- `Query()`: Execute SQL query
|
||||
- `QueryRow()`: Execute query returning single row
|
||||
- `Select()`: Execute query returning multiple rows
|
||||
|
||||
**Implementation**: `clickhouseTelemetryStore` (`pkg/telemetrystore/clickhousetelemetrystore/`)
|
||||
|
||||
### 5. Metadata Store
|
||||
|
||||
**Location**: `pkg/types/telemetrytypes/`
|
||||
|
||||
**Purpose**: Provides metadata about available fields, keys, and attributes
|
||||
|
||||
**Key Methods**:
|
||||
- `GetKeysMulti()`: Get field keys for multiple selectors
|
||||
- `FetchTemporalityMulti()`: Get metric temporality information
|
||||
|
||||
**Implementation**: `telemetryMetadataStore` (`pkg/telemetrymetadata/`)
|
||||
|
||||
### 6. Bucket Cache
|
||||
|
||||
**Location**: `pkg/querier/`
|
||||
|
||||
**Purpose**: Caches query results by time buckets for performance
|
||||
|
||||
**Key Methods**:
|
||||
- `GetMissRanges()`: Get time ranges not in cache
|
||||
- `Put()`: Store query result in cache
|
||||
|
||||
**Features**:
|
||||
- Bucket-based caching (aligned to step intervals)
|
||||
- Automatic cache invalidation
|
||||
- Parallel query execution for missing ranges
|
||||
|
||||
---
|
||||
|
||||
## Query Execution
|
||||
|
||||
### Builder Query Execution
|
||||
|
||||
**Location**: `pkg/querier/builder_query.go`
|
||||
|
||||
**Process**:
|
||||
1. Statement builder generates SQL
|
||||
2. SQL executed against ClickHouse via TelemetryStore
|
||||
3. Results processed based on RequestType:
|
||||
- TimeSeries: Grouped by time buckets and labels
|
||||
- Scalar: Single value extraction
|
||||
- Raw: Row-by-row processing
|
||||
4. Statistics collected (rows scanned, bytes scanned, duration)
|
||||
|
||||
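The sketch below illustrates step 3 for the time-series case: rows are grouped by their label set before being turned into series of timestamp/value points. The `row` and `point` types and the function names are simplified stand-ins for illustration, not the actual types used in `pkg/querier/builder_query.go`.

```go
package seriessketch

import (
	"sort"
	"strings"
)

// row is one result row of a time-series query (simplified stand-in).
type row struct {
	TimestampMs int64
	Labels      map[string]string
	Value       float64
}

// point mirrors TimeSeriesValue in a simplified form.
type point struct {
	TimestampMs int64
	Value       float64
}

// labelKey builds a deterministic key from a label map by sorting label names.
func labelKey(labels map[string]string) string {
	names := make([]string, 0, len(labels))
	for name := range labels {
		names = append(names, name)
	}
	sort.Strings(names)
	parts := make([]string, 0, len(names))
	for _, name := range names {
		parts = append(parts, name+"="+labels[name])
	}
	return strings.Join(parts, ",")
}

// groupIntoSeries groups rows by label set, roughly what happens when
// ClickHouse rows are shaped into TimeSeriesData (illustration only).
func groupIntoSeries(rows []row) map[string][]point {
	series := make(map[string][]point)
	for _, r := range rows {
		k := labelKey(r.Labels)
		series[k] = append(series[k], point{TimestampMs: r.TimestampMs, Value: r.Value})
	}
	return series
}
```
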
### PromQL Query Execution
|
||||
|
||||
**Location**: `pkg/querier/promql_query.go`
|
||||
|
||||
**Process**:
|
||||
1. Query parsed by Prometheus engine
|
||||
2. Executed against Prometheus-compatible data
|
||||
3. Results converted to QueryRangeResponse format
|
||||
|
||||
### ClickHouse SQL Query Execution
|
||||
|
||||
**Location**: `pkg/querier/ch_sql_query.go`
|
||||
|
||||
**Process**:
|
||||
1. SQL query executed directly
|
||||
2. Results processed based on RequestType
|
||||
3. Variable substitution applied
|
||||
|
||||
### Formula Query Execution
|
||||
|
||||
**Location**: `pkg/querier/formula_query.go`
|
||||
|
||||
**Process**:
|
||||
1. Referenced queries executed first
|
||||
2. Formula expression evaluated using govaluate
|
||||
3. Results computed from query results
|
||||
|
||||
### Trace Operator Query Execution
|
||||
|
||||
**Location**: `pkg/querier/trace_operator_query.go`
|
||||
|
||||
**Process**:
|
||||
1. Expression parsed to find dependencies
|
||||
2. Referenced queries executed
|
||||
3. Set operations applied (INTERSECT, UNION, EXCEPT)
|
||||
4. Results combined
|
||||
|
||||
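Conceptually, the set operations behave like the following in-memory sketch over sets of trace IDs. In practice the combination happens inside ClickHouse via INTERSECT/UNION/EXCEPT as noted above, so the helpers below are purely illustrative.

```go
package setsketch

// intersect keeps only trace IDs present in both inputs - the in-memory
// analogue of the SQL INTERSECT used for an "A AND B" trace operator.
func intersect(a, b map[string]struct{}) map[string]struct{} {
	out := make(map[string]struct{})
	for id := range a {
		if _, ok := b[id]; ok {
			out[id] = struct{}{}
		}
	}
	return out
}

// except keeps trace IDs present in a but not in b ("A NOT B" -> EXCEPT).
func except(a, b map[string]struct{}) map[string]struct{} {
	out := make(map[string]struct{})
	for id := range a {
		if _, ok := b[id]; !ok {
			out[id] = struct{}{}
		}
	}
	return out
}
```
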
---
|
||||
|
||||
## Caching
|
||||
|
||||
### Caching Strategy
|
||||
|
||||
**Location**: `pkg/querier/querier.go:642`
|
||||
|
||||
**When Caching Applies**:
|
||||
- Time series queries only
|
||||
- Queries with valid fingerprints
|
||||
- `NoCache` flag not set
|
||||
|
||||
**How It Works**:
|
||||
1. Query fingerprint generated (includes query structure, filters, time range)
|
||||
2. Cache checked for existing results
|
||||
3. Missing time ranges identified
|
||||
4. Queries executed only for missing ranges (parallel execution)
|
||||
5. Fresh results merged with cached results
|
||||
6. Merged result stored in cache
|
||||
|
||||
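A simplified sketch of the "missing time ranges" step, assuming ranges are half-open windows in milliseconds. The type and function names are made up for this example; the real bucket cache in `pkg/querier/` additionally aligns ranges to step intervals.

```go
package cachesketch

import "sort"

// timeRange is a half-open [Start, End) window in epoch milliseconds.
type timeRange struct{ Start, End uint64 }

// missRanges returns the sub-ranges of want not covered by cached bucket
// ranges - a simplified version of the "missing time ranges" computation.
func missRanges(want timeRange, cached []timeRange) []timeRange {
	sort.Slice(cached, func(i, j int) bool { return cached[i].Start < cached[j].Start })
	var miss []timeRange
	cursor := want.Start
	for _, c := range cached {
		if c.End <= cursor || c.Start >= want.End {
			continue // no overlap with the requested window
		}
		if c.Start > cursor {
			miss = append(miss, timeRange{Start: cursor, End: c.Start})
		}
		if c.End > cursor {
			cursor = c.End
		}
	}
	if cursor < want.End {
		miss = append(miss, timeRange{Start: cursor, End: want.End})
	}
	return miss
}
```
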
### Cache Key Generation
|
||||
|
||||
**Location**: `pkg/querier/builder_query.go:52`
|
||||
|
||||
The fingerprint includes:
|
||||
- Signal type
|
||||
- Source type
|
||||
- Step interval
|
||||
- Aggregations
|
||||
- Filters
|
||||
- Group by fields
|
||||
- The time range is used in the cache key, not in the fingerprint itself
|
||||
|
||||
### Cache Benefits
|
||||
|
||||
- **Performance**: Avoids re-executing identical queries
|
||||
- **Efficiency**: Only queries missing time ranges
|
||||
- **Parallelism**: Multiple missing ranges queried in parallel
|
||||
|
||||
---
|
||||
|
||||
## Result Processing
|
||||
|
||||
### Result Merging
|
||||
|
||||
**Location**: `pkg/querier/querier.go:795`
|
||||
|
||||
**Process**:
|
||||
1. Results from multiple queries collected
|
||||
2. For time series: Series merged by labels
|
||||
3. For raw data: Rows combined
|
||||
4. Statistics aggregated (rows scanned, bytes scanned, duration)
|
||||
|
||||
### Formula Evaluation
|
||||
|
||||
**Location**: `pkg/querier/formula_query.go`
|
||||
|
||||
**Process**:
|
||||
1. Formula expression parsed
|
||||
2. Referenced query results retrieved
|
||||
3. Expression evaluated using govaluate library
|
||||
4. Result computed and formatted
|
||||
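As an illustration of step 3, here is a minimal use of govaluate (`github.com/Knetic/govaluate`) evaluating a formula for one aligned data point. The actual `formulaQuery` evaluates the expression across every timestamp and label combination of the referenced query results.

```go
package main

import (
	"fmt"

	"github.com/Knetic/govaluate"
)

func main() {
	// Evaluate "A / B * 100" for a single aligned timestamp of two series.
	expr, err := govaluate.NewEvaluableExpression("A / B * 100")
	if err != nil {
		panic(err)
	}
	result, err := expr.Evaluate(map[string]interface{}{"A": 45.0, "B": 60.0})
	if err != nil {
		panic(err)
	}
	fmt.Println(result) // 75
}
```
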
|
||||
### Variable Substitution
|
||||
|
||||
**Location**: `pkg/querier/querier.go`
|
||||
|
||||
**Process**:
|
||||
1. Variables extracted from request
|
||||
2. Variable values substituted in queries
|
||||
3. Applied to filters, aggregations, and other query parts
|
||||
|
||||
---
|
||||
|
||||
## Performance Considerations
|
||||
|
||||
### Query Optimization
|
||||
|
||||
1. **Time Range Optimization**:
|
||||
- For trace queries with `trace_id` filter, query `trace_summary` first to narrow time range
|
||||
- Use appropriate time ranges to limit data scanned
|
||||
|
||||
2. **Step Interval Calculation**:
|
||||
- Automatic step interval calculation based on time range
|
||||
- Minimum step interval enforcement
|
||||
- Warnings for suboptimal intervals
|
||||
|
||||
3. **Index Usage**:
|
||||
- Queries use time bucket columns (`ts_bucket_start`) for efficient filtering
|
||||
- Proper filter placement for index utilization
|
||||
|
||||
4. **Limit Enforcement**:
|
||||
- Raw data queries should include limits
|
||||
- Pagination support via offset/cursor
|
||||
|
||||
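As a rough illustration of automatic step calculation (item 2 above), here is a heuristic that targets a maximum number of points and enforces a minimum step. The function name, parameters, and rounding choice are assumptions for illustration, not the exact logic used by the querier.

```go
package stepsketch

import "time"

// suggestStep picks a step for a [startMs, endMs] window so that a time-series
// query returns at most roughly maxPoints points, never going below minStep.
// Illustrative heuristic only.
func suggestStep(startMs, endMs, maxPoints uint64, minStep time.Duration) time.Duration {
	if endMs <= startMs || maxPoints == 0 {
		return minStep
	}
	step := time.Duration((endMs-startMs)/maxPoints) * time.Millisecond
	// Round to whole seconds so adjacent buckets align cleanly.
	step = step.Round(time.Second)
	if step < minStep {
		return minStep
	}
	return step
}
```
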
### Best Practices
|
||||
|
||||
1. **Use Query Builder**: Prefer query builder over raw SQL for better optimization
|
||||
2. **Limit Time Ranges**: Always specify reasonable time ranges
|
||||
3. **Use Aggregations**: For large datasets, use aggregations instead of raw data
|
||||
4. **Cache Awareness**: Be mindful of cache TTLs when testing
|
||||
5. **Parallel Queries**: Multiple independent queries execute in parallel
|
||||
6. **Step Intervals**: Let system calculate optimal step intervals
|
||||
|
||||
### Monitoring
|
||||
|
||||
Execution statistics are included in response:
|
||||
- `RowsScanned`: Total rows scanned
|
||||
- `BytesScanned`: Total bytes scanned
|
||||
- `DurationMS`: Query execution time
|
||||
- `StepIntervals`: Step intervals per query
|
||||
|
||||
---
|
||||
|
||||
## Extending the API
|
||||
|
||||
### Adding a New Query Type
|
||||
|
||||
1. **Define Query Type** (`pkg/types/querybuildertypes/querybuildertypesv5/query.go`):
|
||||
```go
|
||||
const (
|
||||
QueryTypeMyNewType QueryType = "my_new_type"
|
||||
)
|
||||
```
|
||||
|
||||
2. **Define Query Spec**:
|
||||
```go
|
||||
type MyNewQuerySpec struct {
|
||||
Name string
|
||||
// ... your fields
|
||||
}
|
||||
```
|
||||
|
||||
3. **Update QueryEnvelope Unmarshaling** (`pkg/types/querybuildertypes/querybuildertypesv5/query.go`):
|
||||
```go
|
||||
case QueryTypeMyNewType:
|
||||
var spec MyNewQuerySpec
|
||||
if err := UnmarshalJSONWithContext(shadow.Spec, &spec, "my new query spec"); err != nil {
|
||||
return wrapUnmarshalError(err, "invalid my new query spec: %v", err)
|
||||
}
|
||||
q.Spec = spec
|
||||
```
|
||||
|
||||
4. **Implement Query Interface** (`pkg/querier/my_new_query.go`):
|
||||
```go
|
||||
type myNewQuery struct {
|
||||
spec MyNewQuerySpec
|
||||
// ... other fields
|
||||
}
|
||||
|
||||
func (q *myNewQuery) Execute(ctx context.Context) (*qbtypes.Result, error) {
|
||||
// Implementation
|
||||
}
|
||||
|
||||
func (q *myNewQuery) Fingerprint() string {
|
||||
// Generate fingerprint for caching
|
||||
}
|
||||
|
||||
func (q *myNewQuery) Window() (uint64, uint64) {
|
||||
// Return time range
|
||||
}
|
||||
```
|
||||
|
||||
5. **Update Querier** (`pkg/querier/querier.go`):
|
||||
```go
|
||||
case QueryTypeMyNewType:
|
||||
myQuery, ok := query.Spec.(MyNewQuerySpec)
|
||||
if !ok {
|
||||
return nil, errors.NewInvalidInputf(...)
|
||||
}
|
||||
queries[myQuery.Name] = newMyNewQuery(myQuery, ...)
|
||||
```
|
||||
|
||||
### Adding a New Request Type
|
||||
|
||||
1. **Define Request Type** (`pkg/types/querybuildertypes/querybuildertypesv5/req.go`):
|
||||
```go
|
||||
const (
|
||||
RequestTypeMyNewType RequestType = "my_new_type"
|
||||
)
|
||||
```
|
||||
|
||||
2. **Update Statement Builders**: Add handling in `Build()` method
|
||||
3. **Update Query Execution**: Add result processing for new type
|
||||
4. **Update Response Models**: Add response data structure
|
||||
|
||||
### Adding a New Aggregation Function
|
||||
|
||||
1. **Update Aggregation Rewriter** (`pkg/querybuilder/agg_expr_rewriter.go`):
|
||||
```go
|
||||
func (r *aggExprRewriter) RewriteAggregation(expr string) (string, error) {
|
||||
if strings.HasPrefix(expr, "my_function(") {
|
||||
// Parse arguments
|
||||
// Return ClickHouse SQL expression
|
||||
return "myClickHouseFunction(...)", nil
|
||||
}
|
||||
// ... existing functions
|
||||
}
|
||||
```
|
||||
|
||||
2. **Update Documentation**: Document the new function
|
||||
|
||||
---
|
||||
|
||||
## Common Patterns
|
||||
|
||||
### Pattern 1: Simple Time Series Query
|
||||
|
||||
```go
|
||||
req := qbtypes.QueryRangeRequest{
|
||||
Start: startMs,
|
||||
End: endMs,
|
||||
RequestType: qbtypes.RequestTypeTimeSeries,
|
||||
CompositeQuery: qbtypes.CompositeQuery{
|
||||
Queries: []qbtypes.QueryEnvelope{
|
||||
{
|
||||
Type: qbtypes.QueryTypeBuilder,
|
||||
Spec: qbtypes.QueryBuilderQuery[qbtypes.MetricAggregation]{
|
||||
Name: "A",
|
||||
Signal: telemetrytypes.SignalMetrics,
|
||||
Aggregations: []qbtypes.MetricAggregation{
|
||||
{Expression: "sum(rate)", Alias: "total"},
|
||||
},
|
||||
StepInterval: qbtypes.Step{Duration: time.Minute},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
```
|
||||
|
||||
### Pattern 2: Query with Filter and Group By
|
||||
|
||||
```go
|
||||
req := qbtypes.QueryRangeRequest{
|
||||
Start: startMs,
|
||||
End: endMs,
|
||||
RequestType: qbtypes.RequestTypeTimeSeries,
|
||||
CompositeQuery: qbtypes.CompositeQuery{
|
||||
Queries: []qbtypes.QueryEnvelope{
|
||||
{
|
||||
Type: qbtypes.QueryTypeBuilder,
|
||||
Spec: qbtypes.QueryBuilderQuery[qbtypes.TraceAggregation]{
|
||||
Name: "A",
|
||||
Signal: telemetrytypes.SignalTraces,
|
||||
Filter: &qbtypes.Filter{
|
||||
Expression: "service.name = 'api' AND duration_nano > 1000000",
|
||||
},
|
||||
Aggregations: []qbtypes.TraceAggregation{
|
||||
{Expression: "count()", Alias: "total"},
|
||||
},
|
||||
GroupBy: []qbtypes.GroupByKey{
|
||||
{TelemetryFieldKey: telemetrytypes.TelemetryFieldKey{
|
||||
Name: "service.name",
|
||||
FieldContext: telemetrytypes.FieldContextResource,
|
||||
}},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
```
|
||||
|
||||
### Pattern 3: Formula Query
|
||||
|
||||
```go
|
||||
req := qbtypes.QueryRangeRequest{
|
||||
Start: startMs,
|
||||
End: endMs,
|
||||
RequestType: qbtypes.RequestTypeTimeSeries,
|
||||
CompositeQuery: qbtypes.CompositeQuery{
|
||||
Queries: []qbtypes.QueryEnvelope{
|
||||
{
|
||||
Type: qbtypes.QueryTypeBuilder,
|
||||
Spec: qbtypes.QueryBuilderQuery[qbtypes.MetricAggregation]{
|
||||
Name: "A",
|
||||
// ... query A definition
|
||||
},
|
||||
},
|
||||
{
|
||||
Type: qbtypes.QueryTypeBuilder,
|
||||
Spec: qbtypes.QueryBuilderQuery[qbtypes.MetricAggregation]{
|
||||
Name: "B",
|
||||
// ... query B definition
|
||||
},
|
||||
},
|
||||
{
|
||||
Type: qbtypes.QueryTypeFormula,
|
||||
Spec: qbtypes.QueryBuilderFormula{
|
||||
Name: "C",
|
||||
Expression: "A / B * 100",
|
||||
},
|
||||
},
|
||||
},
|
||||
},
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Testing
|
||||
|
||||
### Unit Tests
|
||||
|
||||
- `pkg/querier/querier_test.go` - Querier tests
|
||||
- `pkg/querier/builder_query_test.go` - Builder query tests
|
||||
- `pkg/querier/formula_query_test.go` - Formula query tests
|
||||
|
||||
### Integration Tests
|
||||
|
||||
- `tests/integration/` - End-to-end API tests
|
||||
|
||||
### Running Tests
|
||||
|
||||
```bash
|
||||
# Run all querier tests
|
||||
go test ./pkg/querier/...
|
||||
|
||||
# Run with verbose output
|
||||
go test -v ./pkg/querier/...
|
||||
|
||||
# Run specific test
|
||||
go test -v ./pkg/querier/ -run TestQueryRange
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Debugging
|
||||
|
||||
### Enable Debug Logging
|
||||
|
||||
```go
|
||||
// In querier.go
|
||||
q.logger.DebugContext(ctx, "Executing query",
|
||||
"query", queryName,
|
||||
"start", start,
|
||||
"end", end)
|
||||
```
|
||||
|
||||
### Common Issues
|
||||
|
||||
1. **Query Not Found**: Check query name matches in CompositeQuery
|
||||
2. **SQL Errors**: Check generated SQL in logs, verify ClickHouse syntax
|
||||
3. **Performance**: Check execution statistics, optimize time ranges
|
||||
4. **Cache Issues**: Set `NoCache: true` to bypass cache
|
||||
5. **Formula Errors**: Check formula expression syntax and referenced query names
|
||||
|
||||
---
|
||||
|
||||
## References
|
||||
|
||||
### Key Files
|
||||
|
||||
- `pkg/querier/querier.go` - Main query orchestration
|
||||
- `pkg/querier/builder_query.go` - Builder query execution
|
||||
- `pkg/types/querybuildertypes/querybuildertypesv5/` - Request/response models
|
||||
- `pkg/telemetrystore/` - ClickHouse interface
|
||||
- `pkg/telemetrymetadata/` - Metadata store
|
||||
|
||||
### Signal-Specific Documentation
|
||||
|
||||
- [Traces Module](./TRACES_MODULE.md) - Trace-specific details
|
||||
- Logs module documentation (when available)
|
||||
- Metrics module documentation (when available)
|
||||
|
||||
### Related Documentation
|
||||
|
||||
- [ClickHouse Documentation](https://clickhouse.com/docs)
|
||||
- [PromQL Documentation](https://prometheus.io/docs/prometheus/latest/querying/basics/)
|
||||
|
||||
---
|
||||
|
||||
## Contributing
|
||||
|
||||
When contributing to the Query Range API:
|
||||
|
||||
1. **Follow Existing Patterns**: Match the style of existing query types
|
||||
2. **Add Tests**: Include unit tests for new functionality
|
||||
3. **Update Documentation**: Update this doc for significant changes
|
||||
4. **Consider Performance**: Optimize queries and use caching appropriately
|
||||
5. **Handle Errors**: Provide meaningful error messages
|
||||
|
||||
For questions or help, reach out to the maintainers or open an issue.
|
||||
`docs/implementation/SPAN_METRICS_PROCESSOR.md`
|
||||
# SigNoz Span Metrics Processor
|
||||
|
||||
The `signozspanmetricsprocessor` is an OpenTelemetry Collector processor that intercepts trace data to generate RED metrics (Rate, Errors, Duration) from spans.
|
||||
|
||||
**Location:** `signoz-otel-collector/processor/signozspanmetricsprocessor/`
|
||||
|
||||
## Trace Interception
|
||||
|
||||
The processor implements `consumer.Traces` interface and sits in the traces pipeline:
|
||||
|
||||
```go
|
||||
func (p *processorImp) ConsumeTraces(ctx context.Context, traces ptrace.Traces) error {
|
||||
p.lock.Lock()
|
||||
p.aggregateMetrics(traces)
|
||||
p.lock.Unlock()
|
||||
|
||||
return p.tracesConsumer.ConsumeTraces(ctx, traces) // forward unchanged
|
||||
}
|
||||
```
|
||||
|
||||
All traces flow through this method. Metrics are aggregated, then traces are forwarded unmodified to the next consumer.
|
||||
|
||||
## Metrics Generated
|
||||
|
||||
| Metric | Type | Description |
|
||||
|--------|------|-------------|
|
||||
| `signoz_latency` | Histogram | Span latency by service/operation/kind/status |
|
||||
| `signoz_calls_total` | Counter | Call count per service/operation/kind/status |
|
||||
| `signoz_db_latency_sum/count` | Counter | DB call latency (spans with `db.system` attribute) |
|
||||
| `signoz_external_call_latency_sum/count` | Counter | External call latency (client spans with remote address) |
|
||||
|
||||
### Dimensions
|
||||
|
||||
All metrics include these base dimensions:
|
||||
- `service.name` - from resource attributes
|
||||
- `operation` - span name
|
||||
- `span.kind` - SPAN_KIND_SERVER, SPAN_KIND_CLIENT, etc.
|
||||
- `status.code` - STATUS_CODE_OK, STATUS_CODE_ERROR, etc.
|
||||
|
||||
Additional dimensions can be configured.
|
||||
|
||||
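Each unique combination of dimension values becomes one metric series. Conceptually, the aggregation key is just the joined dimension values (plus a time-bucket prefix for delta temporality, see below); the separator and function name in this sketch are made up, since the real processor builds the key in an internal buffer.

```go
package keysketch

import "strings"

// metricKey joins dimension values into a single map key used to index
// per-series state such as histograms and counters. Illustrative only.
func metricKey(values ...string) string {
	return strings.Join(values, "\x00") // unlikely-to-collide separator
}

// Example: metricKey("frontend", "GET /cart", "SPAN_KIND_SERVER", "STATUS_CODE_OK")
```
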
## Aggregation Flow
|
||||
|
||||
```
|
||||
traces pipeline
|
||||
│
|
||||
▼
|
||||
┌─────────────────────────────────────────────────────────┐
|
||||
│ ConsumeTraces() │
|
||||
│ │ │
|
||||
│ ▼ │
|
||||
│ aggregateMetrics(traces) │
|
||||
│ │ │
|
||||
│ ├── for each ResourceSpan │
|
||||
│ │ extract service.name │
|
||||
│ │ │ │
|
||||
│ │ ├── for each Span │
|
||||
│ │ │ │ │
|
||||
│ │ │ ▼ │
|
||||
│ │ │ aggregateMetricsForSpan() │
|
||||
│ │ │ ├── skip stale spans (>24h) │
|
||||
│ │ │ ├── skip excluded patterns │
|
||||
│ │ │ ├── calculate latency │
|
||||
│ │ │ ├── build metric key │
|
||||
│ │ │ ├── update histograms │
|
||||
│ │ │ └── cache dimensions │
|
||||
│ │ │ │
|
||||
│ ▼ │
|
||||
│ forward traces to next consumer │
|
||||
└─────────────────────────────────────────────────────────┘
|
||||
```
|
||||
|
||||
### Periodic Export
|
||||
|
||||
A background goroutine exports aggregated metrics on a ticker interval:
|
||||
|
||||
```go
|
||||
go func() {
|
||||
for {
|
||||
select {
|
||||
case <-p.ticker.C:
|
||||
p.exportMetrics(ctx) // build and send to metrics exporter
|
||||
}
|
||||
}
|
||||
}()
|
||||
```
|
||||
|
||||
## Key Design Features
|
||||
|
||||
### 1. Time Bucketing (Delta Temporality)
|
||||
|
||||
For delta temporality, metric keys include a time bucket prefix:
|
||||
|
||||
```go
|
||||
if p.config.GetAggregationTemporality() == pmetric.AggregationTemporalityDelta {
|
||||
p.AddTimeToKeyBuf(span.StartTimestamp().AsTime()) // truncated to interval
|
||||
}
|
||||
```
|
||||
|
||||
- Spans are grouped by time bucket (default: 1 minute)
|
||||
- After export, buckets are reset
|
||||
- Memory-efficient for high-cardinality data
|
||||
|
||||
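For example, truncating the span start time to the configured interval yields the time-bucket component of the key (the helper below is an illustration; in the processor this truncation happens as part of `AddTimeToKeyBuf`):

```go
package bucketsketch

import "time"

// bucketStart truncates a span start time to the configured interval,
// producing the time-bucket component of the metric key (illustrative helper).
func bucketStart(t time.Time, interval time.Duration) time.Time {
	return t.Truncate(interval) // e.g. 12:03:41 with a 1m interval -> 12:03:00
}
```
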
### 2. LRU Dimension Caching
|
||||
|
||||
Dimension key-value maps are cached to avoid rebuilding:
|
||||
|
||||
```go
|
||||
if _, has := p.metricKeyToDimensions.Get(k); !has {
|
||||
p.metricKeyToDimensions.Add(k, p.buildDimensionKVs(...))
|
||||
}
|
||||
```
|
||||
|
||||
- Configurable cache size (`DimensionsCacheSize`)
|
||||
- Evicted keys also removed from histograms
|
||||
|
||||
### 3. Cardinality Protection
|
||||
|
||||
Prevents memory explosion from high cardinality:
|
||||
|
||||
```go
|
||||
if len(p.serviceToOperations) > p.maxNumberOfServicesToTrack {
|
||||
serviceName = "overflow_service"
|
||||
}
|
||||
if len(p.serviceToOperations[serviceName]) > p.maxNumberOfOperationsToTrackPerService {
|
||||
spanName = "overflow_operation"
|
||||
}
|
||||
```
|
||||
|
||||
Excess services/operations are aggregated into overflow buckets.
|
||||
|
||||
### 4. Exemplars
|
||||
|
||||
Trace/span IDs attached to histogram samples for metric-to-trace correlation:
|
||||
|
||||
```go
|
||||
histo.exemplarsData = append(histo.exemplarsData, exemplarData{
|
||||
traceID: traceID,
|
||||
spanID: spanID,
|
||||
value: latency,
|
||||
})
|
||||
```
|
||||
|
||||
This enables "show me a trace that caused this latency spike" workflows in the UI.
|
||||
|
||||
## Configuration Options
|
||||
|
||||
| Option | Description | Default |
|
||||
|--------|-------------|---------|
|
||||
| `metrics_exporter` | Target exporter for generated metrics | required |
|
||||
| `latency_histogram_buckets` | Custom histogram bucket boundaries | 2,4,6,8,10,50,100,200,400,800,1000,1400,2000,5000,10000,15000 ms |
|
||||
| `dimensions` | Additional span/resource attributes to include | [] |
|
||||
| `dimensions_cache_size` | LRU cache size for dimension maps | 1000 |
|
||||
| `aggregation_temporality` | cumulative or delta | cumulative |
|
||||
| `time_bucket_interval` | Bucket interval for delta temporality | 1m |
|
||||
| `skip_spans_older_than` | Skip stale spans | 24h |
|
||||
| `max_services_to_track` | Cardinality limit for services | - |
|
||||
| `max_operations_to_track_per_service` | Cardinality limit for operations | - |
|
||||
| `exclude_patterns` | Regex patterns to skip spans | [] |
|
||||
|
||||
## Pipeline Configuration Example
|
||||
|
||||
```yaml
|
||||
processors:
|
||||
signozspanmetrics:
|
||||
metrics_exporter: clickhousemetricswrite
|
||||
latency_histogram_buckets: [2ms, 4ms, 6ms, 8ms, 10ms, 50ms, 100ms, 200ms]
|
||||
dimensions:
|
||||
- name: http.method
|
||||
- name: http.status_code
|
||||
dimensions_cache_size: 10000
|
||||
aggregation_temporality: delta
|
||||
|
||||
pipelines:
|
||||
traces:
|
||||
receivers: [otlp]
|
||||
processors: [signozspanmetrics, batch]
|
||||
exporters: [clickhousetraces]
|
||||
|
||||
metrics:
|
||||
receivers: [otlp]
|
||||
exporters: [clickhousemetricswrite]
|
||||
```
|
||||
|
||||
The processor sits in the traces pipeline but exports to a metrics pipeline exporter.
|
||||
`docs/implementation/TRACES_MODULE.md`
|
||||
# SigNoz Traces Module - Developer Guide
|
||||
|
||||
This document provides a comprehensive guide to understanding and contributing to the traces module in SigNoz. It covers architecture, APIs, code flows, and implementation details.
|
||||
|
||||
## Table of Contents
|
||||
|
||||
1. [Overview](#overview)
|
||||
2. [Architecture](#architecture)
|
||||
3. [Data Models](#data-models)
|
||||
4. [API Endpoints](#api-endpoints)
|
||||
5. [Code Flows](#code-flows)
|
||||
6. [Key Components](#key-components)
|
||||
7. [Query Building System](#query-building-system)
|
||||
8. [Storage Schema](#storage-schema)
|
||||
9. [Extending the Traces Module](#extending-the-traces-module)
|
||||
|
||||
---
|
||||
|
||||
## Overview
|
||||
|
||||
The traces module in SigNoz handles distributed tracing data from OpenTelemetry. It provides:
|
||||
|
||||
- **Ingestion**: Receives traces via OpenTelemetry Collector
|
||||
- **Storage**: Stores traces in ClickHouse
|
||||
- **Querying**: Supports complex queries with filters, aggregations, and trace operators
|
||||
- **Visualization**: Provides waterfall and flamegraph views
|
||||
- **Trace Funnels**: Advanced analytics for multi-step trace analysis
|
||||
|
||||
### Key Technologies
|
||||
|
||||
- **Backend**: Go (Golang)
|
||||
- **Storage**: ClickHouse (columnar database)
|
||||
- **Protocol**: OpenTelemetry Protocol (OTLP)
|
||||
- **Query Language**: Custom query builder + ClickHouse SQL
|
||||
|
||||
---
|
||||
|
||||
## Architecture
|
||||
|
||||
### High-Level Flow
|
||||
|
||||
```
|
||||
Application → OpenTelemetry SDK → OTLP Receiver →
|
||||
[Processors: signozspanmetrics, batch] →
|
||||
ClickHouse Traces Exporter → ClickHouse Database
|
||||
↓
|
||||
Query Service (Go)
|
||||
↓
|
||||
Frontend (React/TypeScript)
|
||||
```
|
||||
|
||||
### Component Architecture
|
||||
|
||||
```
|
||||
┌─────────────────────────────────────────────────────────┐
|
||||
│ Frontend (React) │
|
||||
│ - TracesExplorer │
|
||||
│ - TraceDetail (Waterfall/Flamegraph) │
|
||||
│ - Query Builder UI │
|
||||
└────────────────────┬────────────────────────────────────┘
|
||||
│ HTTP/REST API
|
||||
┌────────────────────▼────────────────────────────────────┐
|
||||
│ Query Service (Go) │
|
||||
│ ┌──────────────────────────────────────────────────┐ │
|
||||
│ │ HTTP Handlers (http_handler.go) │ │
|
||||
│ │ - QueryRangeV5 (Main query endpoint) │ │
|
||||
│ │ - GetWaterfallSpansForTrace │ │
|
||||
│ │ - GetFlamegraphSpansForTrace │ │
|
||||
│ │ - Trace Fields API │ │
|
||||
│ └──────────────────────────────────────────────────┘ │
|
||||
│ ┌──────────────────────────────────────────────────┐ │
|
||||
│ │ Querier (querier.go) │ │
|
||||
│ │ - Query orchestration │ │
|
||||
│ │ - Cache management │ │
|
||||
│ │ - Result merging │ │
|
||||
│ └──────────────────────────────────────────────────┘ │
|
||||
│ ┌──────────────────────────────────────────────────┐ │
|
||||
│ │ Statement Builders │ │
|
||||
│ │ - traceQueryStatementBuilder │ │
|
||||
│ │ - traceOperatorStatementBuilder │ │
|
||||
│ │ - Builds ClickHouse SQL from query specs │ │
|
||||
│ └──────────────────────────────────────────────────┘ │
|
||||
│ ┌──────────────────────────────────────────────────┐ │
|
||||
│ │ ClickHouse Reader (clickhouseReader/) │ │
|
||||
│ │ - Direct trace retrieval │ │
|
||||
│ │ - Waterfall/Flamegraph data processing │ │
|
||||
│ └──────────────────────────────────────────────────┘ │
|
||||
└────────────────────┬────────────────────────────────────┘
|
||||
│ ClickHouse Protocol
|
||||
┌────────────────────▼────────────────────────────────────┐
|
||||
│ ClickHouse Database │
|
||||
│ - signoz_traces.distributed_signoz_index_v3 │
|
||||
│ - signoz_traces.distributed_trace_summary │
|
||||
│ - signoz_traces.distributed_tag_attributes_v2 │
|
||||
└──────────────────────────────────────────────────────────┘
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Data Models
|
||||
|
||||
### Core Trace Models
|
||||
|
||||
**Location**: `pkg/query-service/model/trace.go`
|
||||
|
||||
### Query Request Models
|
||||
|
||||
**Location**: `pkg/types/querybuildertypes/querybuildertypesv5/`
|
||||
|
||||
- `QueryRangeRequest`: Main query request structure
|
||||
- `QueryBuilderQuery[TraceAggregation]`: Query builder specification for traces
|
||||
- `QueryBuilderTraceOperator`: Trace operator query specification
|
||||
- `CompositeQuery`: Container for multiple queries
|
||||
|
||||
---
|
||||
|
||||
## API Endpoints
|
||||
|
||||
### 1. Query Range API (V5) - Primary Query Endpoint
|
||||
|
||||
**Endpoint**: `POST /api/v5/query_range`
|
||||
|
||||
**Handler**: `QuerierAPI.QueryRange` → `querier.QueryRange`
|
||||
|
||||
**Purpose**: Main query endpoint for traces, logs, and metrics. Supports:
|
||||
- Query builder queries
|
||||
- Trace operator queries
|
||||
- Aggregations, filters, group by
|
||||
- Time series, scalar, and raw data requests
|
||||
|
||||
> **Note**: For detailed information about the Query Range API, including request/response models, query types, and common code flows, see the [Query Range API Documentation](./QUERY_RANGE_API.md).
|
||||
|
||||
**Trace-Specific Details**:
|
||||
- Uses `traceQueryStatementBuilder` for SQL generation
|
||||
- Supports trace-specific aggregations (count, avg, p99, etc. on duration_nano)
|
||||
- Trace operator queries combine multiple trace queries with set operations
|
||||
- Time range optimization when `trace_id` filter is present
|
||||
|
||||
**Key Files**:
|
||||
- `pkg/telemetrytraces/statement_builder.go` - Trace SQL generation
|
||||
- `pkg/telemetrytraces/trace_operator_statement_builder.go` - Trace operator SQL
|
||||
- `pkg/querier/trace_operator_query.go` - Trace operator execution
|
||||
|
||||
### 2. Waterfall View API
|
||||
|
||||
**Endpoint**: `POST /api/v2/traces/waterfall/{traceId}`
|
||||
|
||||
**Handler**: `GetWaterfallSpansForTraceWithMetadata`
|
||||
|
||||
**Purpose**: Retrieves spans for waterfall visualization with metadata
|
||||
|
||||
**Request Parameters**:
|
||||
```go
|
||||
type GetWaterfallSpansForTraceWithMetadataParams struct {
|
||||
SelectedSpanID string // Selected span to focus on
|
||||
IsSelectedSpanIDUnCollapsed bool // Whether selected span is expanded
|
||||
UncollapsedSpans []string // List of expanded span IDs
|
||||
}
|
||||
```
|
||||
|
||||
**Response**:
|
||||
```go
|
||||
type GetWaterfallSpansForTraceWithMetadataResponse struct {
|
||||
StartTimestampMillis uint64 // Trace start time
|
||||
EndTimestampMillis uint64 // Trace end time
|
||||
DurationNano uint64 // Total duration
|
||||
RootServiceName string // Root service
|
||||
RootServiceEntryPoint string // Entry point operation
|
||||
TotalSpansCount uint64 // Total spans
|
||||
TotalErrorSpansCount uint64 // Error spans
|
||||
ServiceNameToTotalDurationMap map[string]uint64 // Service durations
|
||||
Spans []*Span // Span tree
|
||||
HasMissingSpans bool // Missing spans indicator
|
||||
UncollapsedSpans []string // Expanded spans
|
||||
}
|
||||
```
|
||||
|
||||
**Code Flow**:
|
||||
```
|
||||
Handler → ClickHouseReader.GetWaterfallSpansForTraceWithMetadata
|
||||
→ Query trace_summary for time range
|
||||
→ Query spans from signoz_index_v3
|
||||
→ Build span tree structure
|
||||
→ Apply uncollapsed/selected span logic
|
||||
→ Return filtered spans (500 span limit)
|
||||
```
|
||||
|
||||
**Key Files**:
|
||||
- `pkg/query-service/app/http_handler.go:1748` - Handler
|
||||
- `pkg/query-service/app/clickhouseReader/reader.go:873` - Implementation
|
||||
- `pkg/query-service/app/traces/tracedetail/waterfall.go` - Tree processing
|
||||
|
||||
### 3. Flamegraph View API
|
||||
|
||||
**Endpoint**: `POST /api/v2/traces/flamegraph/{traceId}`
|
||||
|
||||
**Handler**: `GetFlamegraphSpansForTrace`
|
||||
|
||||
**Purpose**: Retrieves spans organized by level for flamegraph visualization
|
||||
|
||||
**Request Parameters**:
|
||||
```go
|
||||
type GetFlamegraphSpansForTraceParams struct {
|
||||
SelectedSpanID string // Selected span ID
|
||||
}
|
||||
```
|
||||
|
||||
**Response**:
|
||||
```go
|
||||
type GetFlamegraphSpansForTraceResponse struct {
|
||||
StartTimestampMillis uint64 // Trace start
|
||||
EndTimestampMillis uint64 // Trace end
|
||||
DurationNano uint64 // Total duration
|
||||
Spans [][]*FlamegraphSpan // Spans organized by level
|
||||
}
|
||||
```
|
||||
|
||||
**Code Flow**:
|
||||
```
|
||||
Handler → ClickHouseReader.GetFlamegraphSpansForTrace
|
||||
→ Query trace_summary for time range
|
||||
→ Query spans from signoz_index_v3
|
||||
→ Build span tree
|
||||
→ BFS traversal to organize by level
|
||||
→ Sample spans (50 levels, 100 spans/level max)
|
||||
→ Return level-organized spans
|
||||
```
|
||||
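A simplified sketch of the BFS step: starting from the root spans, each pass collects the next level of children, producing the level-by-level structure returned as `Spans [][]*FlamegraphSpan`. The types and names are stand-ins, and the sampling limits (50 levels, 100 spans per level) are omitted.

```go
package flamegraphsketch

// node is a minimal stand-in for a span with resolved children.
type node struct {
	SpanID   string
	Children []*node
}

// levelize organizes a span tree into levels via breadth-first traversal,
// mirroring the "BFS traversal to organize by level" step above.
func levelize(roots []*node) [][]*node {
	var levels [][]*node
	frontier := roots
	for len(frontier) > 0 {
		levels = append(levels, frontier)
		var next []*node
		for _, n := range frontier {
			next = append(next, n.Children...)
		}
		frontier = next
	}
	return levels
}
```
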
|
||||
**Key Files**:
|
||||
- `pkg/query-service/app/http_handler.go:1781` - Handler
|
||||
- `pkg/query-service/app/clickhouseReader/reader.go:1091` - Implementation
|
||||
- `pkg/query-service/app/traces/tracedetail/flamegraph.go` - BFS processing
|
||||
|
||||
### 4. Trace Fields API
|
||||
|
||||
**Endpoint**:
|
||||
- `GET /api/v2/traces/fields` - Get available trace fields
|
||||
- `POST /api/v2/traces/fields` - Update trace field metadata
|
||||
|
||||
**Handler**: `traceFields`, `updateTraceField`
|
||||
|
||||
**Purpose**: Manage trace field metadata for query builder
|
||||
|
||||
**Key Files**:
|
||||
- `pkg/query-service/app/http_handler.go:4912` - Get handler
|
||||
- `pkg/query-service/app/http_handler.go:4921` - Update handler
|
||||
|
||||
### 5. Trace Funnels API
|
||||
|
||||
**Endpoint**: `/api/v1/trace-funnels/*`
|
||||
|
||||
**Purpose**: Manage trace funnels (multi-step trace analysis)
|
||||
|
||||
**Endpoints**:
|
||||
- `POST /api/v1/trace-funnels/new` - Create funnel
|
||||
- `GET /api/v1/trace-funnels/list` - List funnels
|
||||
- `GET /api/v1/trace-funnels/{funnel_id}` - Get funnel
|
||||
- `PUT /api/v1/trace-funnels/{funnel_id}` - Update funnel
|
||||
- `DELETE /api/v1/trace-funnels/{funnel_id}` - Delete funnel
|
||||
- `POST /api/v1/trace-funnels/{funnel_id}/analytics/*` - Analytics endpoints
|
||||
|
||||
**Key Files**:
|
||||
- `pkg/query-service/app/http_handler.go:5084` - Route registration
|
||||
- `pkg/modules/tracefunnel/` - Funnel implementation
|
||||
|
||||
---
|
||||
|
||||
## Code Flows
|
||||
|
||||
### Flow 1: Query Range Request (V5)
|
||||
|
||||
This is the primary query flow for traces. For the complete flow covering all query types, see the [Query Range API Documentation](./QUERY_RANGE_API.md#code-flow).
|
||||
|
||||
**Trace-Specific Flow**:
|
||||
|
||||
```
|
||||
1. HTTP Request
|
||||
POST /api/v5/query_range
|
||||
↓
|
||||
2. Querier.QueryRange (common flow - see QUERY_RANGE_API.md)
|
||||
↓
|
||||
3. Trace Query Processing:
|
||||
a. Builder Query (QueryTypeBuilder with SignalTraces):
|
||||
- newBuilderQuery() creates builderQuery instance
|
||||
- Uses traceStmtBuilder (traceQueryStatementBuilder)
|
||||
↓
|
||||
b. Trace Operator Query (QueryTypeTraceOperator):
|
||||
- newTraceOperatorQuery() creates traceOperatorQuery
|
||||
- Uses traceOperatorStmtBuilder
|
||||
↓
|
||||
4. Trace Statement Building
|
||||
traceQueryStatementBuilder.Build() (pkg/telemetrytraces/statement_builder.go:58)
|
||||
- Resolves trace field keys from metadata store
|
||||
- Optimizes time range if trace_id filter present (queries trace_summary)
|
||||
- Maps fields using traceFieldMapper
|
||||
- Builds conditions using traceConditionBuilder
|
||||
- Builds SQL based on request type:
|
||||
* RequestTypeRaw → buildListQuery()
|
||||
* RequestTypeTimeSeries → buildTimeSeriesQuery()
|
||||
* RequestTypeScalar → buildScalarQuery()
|
||||
* RequestTypeTrace → buildTraceQuery()
|
||||
↓
|
||||
5. Query Execution
|
||||
builderQuery.Execute() (pkg/querier/builder_query.go)
|
||||
- Executes SQL against ClickHouse (signoz_traces database)
|
||||
- Processes results into response format
|
||||
↓
|
||||
6. Result Processing (common flow - see QUERY_RANGE_API.md)
|
||||
- Merges results from multiple queries
|
||||
- Applies formulas if present
|
||||
- Handles caching
|
||||
↓
|
||||
7. HTTP Response
|
||||
- Returns QueryRangeResponse with trace results
|
||||
```
|
||||
|
||||
**Trace-Specific Key Components**:
|
||||
- `pkg/telemetrytraces/statement_builder.go` - Trace SQL generation
|
||||
- `pkg/telemetrytraces/field_mapper.go` - Trace field mapping
|
||||
- `pkg/telemetrytraces/condition_builder.go` - Trace filter building
|
||||
- `pkg/telemetrytraces/trace_operator_statement_builder.go` - Trace operator SQL
|
||||
|
||||
### Flow 2: Waterfall View Request
|
||||
|
||||
```
|
||||
1. HTTP Request
|
||||
POST /api/v2/traces/waterfall/{traceId}
|
||||
↓
|
||||
2. GetWaterfallSpansForTraceWithMetadata handler
|
||||
- Extracts traceId from URL
|
||||
- Parses request body for params
|
||||
↓
|
||||
3. ClickHouseReader.GetWaterfallSpansForTraceWithMetadata
|
||||
- Checks cache first (5 minute TTL)
|
||||
↓
|
||||
4. If cache miss:
|
||||
a. Query trace_summary table
|
||||
SELECT * FROM distributed_trace_summary WHERE trace_id = ?
|
||||
- Gets time range (start, end, num_spans)
|
||||
↓
|
||||
b. Query spans table
|
||||
SELECT ... FROM distributed_signoz_index_v3
|
||||
WHERE trace_id = ?
|
||||
AND ts_bucket_start >= ? AND ts_bucket_start <= ?
|
||||
- Retrieves all spans for trace
|
||||
↓
|
||||
c. Build span tree
|
||||
- Parse references to build parent-child relationships
|
||||
- Identify root spans (no parent)
|
||||
- Calculate service durations
|
||||
↓
|
||||
d. Cache result
|
||||
↓
|
||||
5. Apply selection logic
|
||||
tracedetail.GetSelectedSpans()
|
||||
- Traverses tree based on uncollapsed spans
|
||||
- Finds path to selected span
|
||||
- Returns sliding window (500 spans max)
|
||||
↓
|
||||
6. HTTP Response
|
||||
- Returns spans with metadata
|
||||
```
|
||||
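A simplified sketch of step 4c: spans are linked to their parents by span ID, and spans whose parent is absent from the trace are treated as roots (which is also how missing spans surface in `HasMissingSpans`). The types here are stand-ins for the real span model, not the actual implementation.

```go
package treesketch

// spanNode is a simplified span with just the fields needed to build a tree.
type spanNode struct {
	SpanID       string
	ParentSpanID string
	Children     []*spanNode
}

// buildTree links spans to their parents and returns the root spans
// (spans whose parent is empty or not present in the trace).
func buildTree(spans []*spanNode) []*spanNode {
	byID := make(map[string]*spanNode, len(spans))
	for _, s := range spans {
		byID[s.SpanID] = s
	}
	var roots []*spanNode
	for _, s := range spans {
		if parent, ok := byID[s.ParentSpanID]; ok && s.ParentSpanID != "" {
			parent.Children = append(parent.Children, s)
		} else {
			roots = append(roots, s) // missing parent => treat as a root
		}
	}
	return roots
}
```
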
|
||||
**Key Components**:
|
||||
- `pkg/query-service/app/clickhouseReader/reader.go:873`
|
||||
- `pkg/query-service/app/traces/tracedetail/waterfall.go`
|
||||
- `pkg/query-service/model/trace.go`
|
||||
|
||||
### Flow 3: Trace Operator Query
|
||||
|
||||
Trace operators allow combining multiple trace queries with set operations.
|
||||
|
||||
```
|
||||
1. QueryRangeRequest with QueryTypeTraceOperator
|
||||
↓
|
||||
2. Querier identifies trace operator queries
|
||||
- Parses expression to find dependencies
|
||||
- Collects referenced queries
|
||||
↓
|
||||
3. traceOperatorStatementBuilder.Build()
|
||||
- Parses expression (e.g., "A AND B", "A OR B")
|
||||
- Builds expression tree
|
||||
↓
|
||||
4. traceOperatorCTEBuilder.build()
|
||||
- Creates CTEs (Common Table Expressions) for each query
|
||||
- Builds final query with set operations:
|
||||
* AND → INTERSECT
|
||||
* OR → UNION
|
||||
* NOT → EXCEPT
|
||||
↓
|
||||
5. Execute combined query
|
||||
- Returns traces matching the operator expression
|
||||
```
|
||||
|
||||
**Key Components**:
|
||||
- `pkg/telemetrytraces/trace_operator_statement_builder.go`
|
||||
- `pkg/telemetrytraces/trace_operator_cte_builder.go`
|
||||
- `pkg/querier/trace_operator_query.go`
|
||||
|
||||
---
|
||||
|
||||
## Key Components
|
||||
|
||||
> **Note**: For common components used across all signals (Querier, TelemetryStore, MetadataStore, etc.), see the [Query Range API Documentation](./QUERY_RANGE_API.md#key-components).
|
||||
|
||||
### 1. Trace Statement Builder
|
||||
|
||||
**Location**: `pkg/telemetrytraces/statement_builder.go`
|
||||
|
||||
**Purpose**: Converts trace query builder specifications into ClickHouse SQL
|
||||
|
||||
**Key Methods**:
|
||||
- `Build()`: Main entry point, builds SQL statement
|
||||
- `buildListQuery()`: Builds query for raw/list results
|
||||
- `buildTimeSeriesQuery()`: Builds query for time series
|
||||
- `buildScalarQuery()`: Builds query for scalar values
|
||||
- `buildTraceQuery()`: Builds query for trace-specific results
|
||||
|
||||
**Key Features**:
|
||||
- Trace field resolution via metadata store
|
||||
- Time range optimization for trace_id filters (queries trace_summary first)
|
||||
- Support for trace aggregations, filters, group by, ordering
|
||||
- Calculated field support (http_method, db_name, has_error, etc.)
|
||||
- Resource filter support via resourceFilterStmtBuilder
|
||||
|
||||
### 2. Trace Field Mapper
|
||||
|
||||
**Location**: `pkg/telemetrytraces/field_mapper.go`
|
||||
|
||||
**Purpose**: Maps trace query field names to ClickHouse column names
|
||||
|
||||
**Field Types**:
|
||||
- **Intrinsic Fields**: Built-in fields (trace_id, span_id, duration_nano, name, kind_string, status_code_string, etc.)
|
||||
- **Calculated Fields**: Derived fields (http_method, db_name, has_error, response_status_code, etc.)
|
||||
- **Attribute Fields**: Dynamic span/resource attributes (accessed via attributes_string, attributes_number, attributes_bool, resources_string)
|
||||
|
||||
**Example Mapping**:
|
||||
```
|
||||
"service.name" → "resource_string_service$$name"
|
||||
"http.method" → Calculated from attributes_string['http.method']
|
||||
"duration_nano" → "duration_nano" (intrinsic)
|
||||
"trace_id" → "trace_id" (intrinsic)
|
||||
```
|
||||
|
||||
**Key Methods**:
|
||||
- `MapField()`: Maps a field to ClickHouse expression
|
||||
- `MapAttribute()`: Maps attribute fields
|
||||
- `MapResource()`: Maps resource fields
|
||||
|
||||
### 3. Trace Condition Builder
|
||||
|
||||
**Location**: `pkg/telemetrytraces/condition_builder.go`
|
||||
|
||||
**Purpose**: Builds WHERE clause conditions from trace filter expressions
|
||||
|
||||
**Supported Operators**:
|
||||
- `=`, `!=`, `IN`, `NOT IN`
|
||||
- `>`, `>=`, `<`, `<=`
|
||||
- `LIKE`, `NOT LIKE`, `ILIKE`
|
||||
- `EXISTS`, `NOT EXISTS`
|
||||
- `CONTAINS`, `NOT CONTAINS`
|
||||
|
||||
**Key Methods**:
|
||||
- `BuildCondition()`: Builds condition from filter expression
|
||||
- Handles attribute, resource, and intrinsic field filtering
|
||||
|
||||
### 4. Trace Operator Statement Builder
|
||||
|
||||
**Location**: `pkg/telemetrytraces/trace_operator_statement_builder.go`
|
||||
|
||||
**Purpose**: Builds SQL for trace operator queries (AND, OR, NOT operations on trace queries)
|
||||
|
||||
**Key Methods**:
|
||||
- `Build()`: Builds CTE-based SQL for trace operators
|
||||
- Uses `traceOperatorCTEBuilder` to create Common Table Expressions
|
||||
|
||||
**Features**:
|
||||
- Parses operator expressions (e.g., "A AND B")
|
||||
- Creates CTEs for each referenced query
|
||||
- Combines results using INTERSECT, UNION, EXCEPT
|
||||
|
||||
### 5. ClickHouse Reader (Trace-Specific Methods)
|
||||
|
||||
**Location**: `pkg/query-service/app/clickhouseReader/reader.go`
|
||||
|
||||
**Purpose**: Direct trace data retrieval and processing (bypasses query builder)
|
||||
|
||||
**Key Methods**:
|
||||
- `GetWaterfallSpansForTraceWithMetadata()`: Waterfall view data
|
||||
- `GetFlamegraphSpansForTrace()`: Flamegraph view data
|
||||
- `SearchTraces()`: Legacy trace search (still used for some flows)
|
||||
- `GetMinAndMaxTimestampForTraceID()`: Time range optimization helper
|
||||
|
||||
**Caching**: Implements 5-minute cache for trace detail views
|
||||
|
||||
**Note**: These methods are used for trace-specific visualizations. For general trace queries, use the Query Range API.
|
||||
|
||||
---
|
||||
|
||||
## Query Building System
|
||||
|
||||
> **Note**: For general query building concepts and patterns, see the [Query Range API Documentation](./QUERY_RANGE_API.md). This section covers trace-specific aspects.
|
||||
|
||||
### Trace Query Builder Structure
|
||||
|
||||
A trace query consists of:
|
||||
|
||||
```go
|
||||
QueryBuilderQuery[TraceAggregation] {
|
||||
Name: "query_name",
|
||||
Signal: SignalTraces,
|
||||
Filter: &Filter {
|
||||
Expression: "service.name = 'api' AND duration_nano > 1000000"
|
||||
},
|
||||
Aggregations: []TraceAggregation {
|
||||
{Expression: "count()", Alias: "total"},
|
||||
{Expression: "avg(duration_nano)", Alias: "avg_duration"},
|
||||
{Expression: "p99(duration_nano)", Alias: "p99"},
|
||||
},
|
||||
GroupBy: []GroupByKey {
|
||||
{TelemetryFieldKey: {Name: "service.name", ...}},
|
||||
},
|
||||
Order: []OrderBy {...},
|
||||
Limit: 100,
|
||||
}
|
||||
```
|
||||
|
||||
### Trace-Specific SQL Generation Process
|
||||
|
||||
1. **Field Resolution**:
|
||||
- Resolve trace field names using `traceFieldMapper`
|
||||
- Handle intrinsic, calculated, and attribute fields
|
||||
- Map to ClickHouse columns (e.g., `service.name` → `resource_string_service$$name`)
|
||||
|
||||
2. **Time Range Optimization**:
|
||||
- If `trace_id` filter present, query `trace_summary` first
|
||||
- Narrow time range based on trace start/end times
|
||||
- Reduces data scanned significantly
|
||||
|
||||
3. **Filter Building**:
|
||||
- Convert filter expression using `traceConditionBuilder`
|
||||
- Handle attribute filters (attributes_string, attributes_number, attributes_bool)
|
||||
- Handle resource filters (resources_string)
|
||||
- Handle intrinsic field filters
|
||||
|
||||
4. **Aggregation Building**:
|
||||
- Build SELECT with trace aggregations
|
||||
- Support trace-specific functions (count, avg, p99, etc. on duration_nano)
|
||||
|
||||
5. **Group By Building**:
|
||||
- Add GROUP BY clause with trace fields
|
||||
- Support grouping by service.name, operation name, etc.
|
||||
|
||||
6. **Order Building**:
|
||||
- Add ORDER BY clause
|
||||
- Support ordering by duration, timestamp, etc.
|
||||
|
||||
7. **Limit/Offset**:
|
||||
- Add pagination
|
||||
|
||||
### Example Generated SQL
|
||||
|
||||
For query: `count() WHERE service.name = 'api' GROUP BY service.name`
|
||||
|
||||
```sql
|
||||
SELECT
|
||||
count() AS total,
|
||||
resource_string_service$$name AS service_name
|
||||
FROM signoz_traces.distributed_signoz_index_v3
|
||||
WHERE
|
||||
timestamp >= toDateTime64(1234567890/1e9, 9)
|
||||
AND timestamp <= toDateTime64(1234567899/1e9, 9)
|
||||
AND ts_bucket_start >= toDateTime64(1234567890/1e9, 9)
|
||||
AND ts_bucket_start <= toDateTime64(1234567899/1e9, 9)
|
||||
AND resource_string_service$$name = 'api'
|
||||
GROUP BY resource_string_service$$name
|
||||
```
|
||||
|
||||
**Note**: The query uses `ts_bucket_start` for efficient time filtering (partitioning column).
|
||||
|
||||
---

## Storage Schema

### Main Tables

**Location**: `pkg/telemetrytraces/tables.go`

#### 1. `distributed_signoz_index_v3`

Main span index table. Stores all span data.

**Key Columns**:
- `timestamp`: Span timestamp
- `duration_nano`: Span duration
- `span_id`, `trace_id`: Identifiers
- `has_error`: Error indicator
- `kind`: Span kind
- `name`: Operation name
- `attributes_string`, `attributes_number`, `attributes_bool`: Attributes
- `resources_string`: Resource attributes
- `events`: Span events
- `status_code_string`, `status_message`: Status
- `ts_bucket_start`: Time bucket used for partitioning

#### 2. `distributed_trace_summary`

Trace-level summary used for quick lookups.

**Columns**:
- `trace_id`: Trace identifier
- `start`: Earliest span timestamp
- `end`: Latest span timestamp
- `num_spans`: Total span count

#### 3. `distributed_tag_attributes_v2`

Metadata table for attribute keys.

**Purpose**: Stores available attribute keys for autocomplete.

#### 4. `distributed_span_attributes_keys`

Span attribute keys metadata.

**Purpose**: Tracks which attributes exist in spans.

### Database

All trace tables live in the `signoz_traces` database.
---

## Extending the Traces Module

### Adding a New Calculated Field

1. **Define Field in Constants** (`pkg/telemetrytraces/const.go`):
```go
CalculatedFields = map[string]telemetrytypes.TelemetryFieldKey{
    "my_new_field": {
        Name:          "my_new_field",
        Description:   "Description of the field",
        Signal:        telemetrytypes.SignalTraces,
        FieldContext:  telemetrytypes.FieldContextSpan,
        FieldDataType: telemetrytypes.FieldDataTypeString,
    },
}
```

2. **Implement Field Mapping** (`pkg/telemetrytraces/field_mapper.go`):
```go
func (fm *fieldMapper) MapField(field telemetrytypes.TelemetryFieldKey) (string, error) {
    if field.Name == "my_new_field" {
        // Return the ClickHouse expression for this field
        return "attributes_string['my.attribute.key']", nil
    }
    // ... existing mappings
}
```

3. **Update Condition Builder** (if needed for filtering):
```go
// In condition_builder.go, add support for your field
```
### Adding a New API Endpoint

1. **Add Handler Method** (`pkg/query-service/app/http_handler.go`):
```go
func (aH *APIHandler) MyNewTraceHandler(w http.ResponseWriter, r *http.Request) {
    // Extract parameters
    // Call the reader or querier
    // Return the response
}
```

2. **Register Route** (in `RegisterRoutes` or a separate method):
```go
router.HandleFunc("/api/v2/traces/my-endpoint",
    am.ViewAccess(aH.MyNewTraceHandler)).Methods(http.MethodPost)
```

3. **Implement Logic**:
   - Add to `ClickHouseReader` if direct DB access is needed
   - Or use `Querier` for query builder queries (see the sketch below)
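
For illustration, here is a minimal sketch of how step 3 could come together, reusing the query builder types shown earlier; the `aH.querier.QueryRange` call and the response handling are assumptions, not the exact `Querier` interface (imports are elided, as in the other snippets in this doc):

```go
// Sketch only: the querier method name and result handling are illustrative.
func (aH *APIHandler) MyNewTraceHandler(w http.ResponseWriter, r *http.Request) {
    query := qbtypes.QueryBuilderQuery[qbtypes.TraceAggregation]{
        Name:   "my_endpoint_query",
        Signal: telemetrytypes.SignalTraces,
        Filter: &qbtypes.Filter{Expression: "has_error = true"},
        Aggregations: []qbtypes.TraceAggregation{
            {Expression: "count()", Alias: "error_spans"},
        },
    }

    // Hypothetical call; see pkg/querier for the real interface.
    result, err := aH.querier.QueryRange(r.Context(), query)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }

    w.Header().Set("Content-Type", "application/json")
    _ = json.NewEncoder(w).Encode(result)
}
```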
### Adding a New Aggregation Function

1. **Update Aggregation Rewriter** (`pkg/querybuilder/agg_expr_rewriter.go`):
```go
func (r *aggExprRewriter) RewriteAggregation(expr string) (string, error) {
    // Add parsing for your function
    if strings.HasPrefix(expr, "my_function(") {
        // Return the equivalent ClickHouse SQL expression
        return "myClickHouseFunction(...)", nil
    }
    // ... fall through to the existing rewrites
}
```

2. **Update Statement Builder** (if special handling is needed):
```go
// In statement_builder.go, add a special case if needed
```
### Adding Trace Operator Support

Trace operators are already extensible. To add a new operator:

1. **Update Grammar** (`grammar/TraceOperatorGrammar.g4`):
```antlr
operator: AND | OR | NOT | MY_NEW_OPERATOR;
```

2. **Update CTE Builder** (`pkg/telemetrytraces/trace_operator_cte_builder.go`):
```go
func (b *traceOperatorCTEBuilder) buildOperatorQuery(op TraceOperatorType) string {
    switch op {
    case TraceOperatorTypeMyNewOperator:
        return "MY_CLICKHOUSE_OPERATION"
    }
}
```
---

## Common Patterns

### Pattern 1: Query with Filter

```go
query := qbtypes.QueryBuilderQuery[qbtypes.TraceAggregation]{
    Name:   "filtered_traces",
    Signal: telemetrytypes.SignalTraces,
    Filter: &qbtypes.Filter{
        Expression: "service.name = 'api' AND duration_nano > 1000000",
    },
    Aggregations: []qbtypes.TraceAggregation{
        {Expression: "count()", Alias: "total"},
    },
}
```

### Pattern 2: Time Series Query

```go
query := qbtypes.QueryBuilderQuery[qbtypes.TraceAggregation]{
    Name:   "time_series",
    Signal: telemetrytypes.SignalTraces,
    Aggregations: []qbtypes.TraceAggregation{
        {Expression: "avg(duration_nano)", Alias: "avg_duration"},
    },
    GroupBy: []qbtypes.GroupByKey{
        {TelemetryFieldKey: telemetrytypes.TelemetryFieldKey{
            Name:         "service.name",
            FieldContext: telemetrytypes.FieldContextResource,
        }},
    },
    StepInterval: qbtypes.Step{Duration: time.Minute},
}
```

### Pattern 3: Trace Operator Query

```go
query := qbtypes.QueryBuilderTraceOperator{
    Name:       "operator_query",
    Expression: "A AND B", // A and B are query names
    Filter: &qbtypes.Filter{
        Expression: "duration_nano > 5000000",
    },
}
```
---

## Performance Considerations

### Caching

- **Trace Detail Views**: 5-minute cache for waterfall/flamegraph responses
- **Query Results**: Bucket-based caching in the querier
- **Metadata**: Cached attribute keys and field metadata

### Query Optimization

1. **Time Range Optimization**: When `trace_id` is in the filter, query `trace_summary` first to narrow the time range
2. **Index Usage**: Queries filter on `ts_bucket_start` for efficient time pruning
3. **Limit Enforcement**: Waterfall and flamegraph views enforce span limits (500 and 50 respectively)

### Best Practices

1. **Use the Query Builder**: Prefer the query builder over raw SQL for better optimization
2. **Limit Time Ranges**: Always specify reasonable time ranges
3. **Use Aggregations**: For large datasets, aggregate instead of fetching raw data
4. **Cache Awareness**: Be mindful of cache TTLs when testing
---

## References

### Key Files

- `pkg/telemetrytraces/` - Core trace query building
  - `statement_builder.go` - Trace SQL generation
  - `field_mapper.go` - Trace field mapping
  - `condition_builder.go` - Trace filter building
  - `trace_operator_statement_builder.go` - Trace operator SQL
- `pkg/query-service/app/clickhouseReader/reader.go` - Direct trace access
- `pkg/query-service/app/http_handler.go` - API handlers
- `pkg/query-service/model/trace.go` - Data models

### Related Documentation

- [Query Range API Documentation](./QUERY_RANGE_API.md) - Common query_range API details
- [OpenTelemetry Specification](https://opentelemetry.io/docs/specs/)
- [ClickHouse Documentation](https://clickhouse.com/docs)
- [Query Builder Guide](../contributing/go/query-builder.md)
---

## Contributing

When contributing to the traces module:

1. **Follow Existing Patterns**: Match the style of existing code
2. **Add Tests**: Include unit tests for new functionality
3. **Update Documentation**: Update this doc for significant changes
4. **Consider Performance**: Optimize queries and use caching appropriately
5. **Handle Errors**: Provide meaningful error messages

For questions or help, reach out to the maintainers or open an issue.

ee/gateway/httpgateway/provider.go (new file, 282 lines)
@@ -0,0 +1,282 @@
|
||||
package httpgateway
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"context"
|
||||
"encoding/json"
|
||||
"io"
|
||||
"net/http"
|
||||
"net/url"
|
||||
"strconv"
|
||||
"time"
|
||||
|
||||
"github.com/SigNoz/signoz/pkg/errors"
|
||||
"github.com/SigNoz/signoz/pkg/factory"
|
||||
"github.com/SigNoz/signoz/pkg/gateway"
|
||||
"github.com/SigNoz/signoz/pkg/http/client"
|
||||
"github.com/SigNoz/signoz/pkg/licensing"
|
||||
"github.com/SigNoz/signoz/pkg/types/gatewaytypes"
|
||||
"github.com/SigNoz/signoz/pkg/valuer"
|
||||
"github.com/tidwall/gjson"
|
||||
)
|
||||
|
||||
type Provider struct {
|
||||
settings factory.ScopedProviderSettings
|
||||
config gateway.Config
|
||||
httpClient *client.Client
|
||||
licensing licensing.Licensing
|
||||
}
|
||||
|
||||
func NewProviderFactory(licensing licensing.Licensing) factory.ProviderFactory[gateway.Gateway, gateway.Config] {
|
||||
return factory.NewProviderFactory(factory.MustNewName("http"), func(ctx context.Context, ps factory.ProviderSettings, c gateway.Config) (gateway.Gateway, error) {
|
||||
return New(ctx, ps, c, licensing)
|
||||
})
|
||||
}
|
||||
|
||||
func New(ctx context.Context, providerSettings factory.ProviderSettings, config gateway.Config, licensing licensing.Licensing) (gateway.Gateway, error) {
|
||||
settings := factory.NewScopedProviderSettings(providerSettings, "github.com/SigNoz/signoz/ee/gateway/httpgateway")
|
||||
|
||||
httpClient, err := client.New(
|
||||
settings.Logger(),
|
||||
providerSettings.TracerProvider,
|
||||
providerSettings.MeterProvider,
|
||||
client.WithRequestResponseLog(true),
|
||||
client.WithRetryCount(3),
|
||||
)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
return &Provider{
|
||||
settings: settings,
|
||||
config: config,
|
||||
httpClient: httpClient,
|
||||
licensing: licensing,
|
||||
}, nil
|
||||
}
|
||||
|
||||
func (provider *Provider) GetIngestionKeys(ctx context.Context, orgID valuer.UUID, page, perPage int) (*gatewaytypes.GettableIngestionKeys, error) {
|
||||
qParams := url.Values{}
|
||||
qParams.Add("page", strconv.Itoa(page))
|
||||
qParams.Add("per_page", strconv.Itoa(perPage))
|
||||
|
||||
responseBody, err := provider.do(ctx, orgID, http.MethodGet, "/v1/workspaces/me/keys", qParams, nil)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var ingestionKeys []gatewaytypes.IngestionKey
|
||||
if err := json.Unmarshal([]byte(gjson.GetBytes(responseBody, "data").String()), &ingestionKeys); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var pagination gatewaytypes.Pagination
|
||||
if err := json.Unmarshal([]byte(gjson.GetBytes(responseBody, "_pagination").String()), &pagination); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
return &gatewaytypes.GettableIngestionKeys{
|
||||
Keys: ingestionKeys,
|
||||
Pagination: pagination,
|
||||
}, nil
|
||||
}
|
||||
|
||||
func (provider *Provider) SearchIngestionKeysByName(ctx context.Context, orgID valuer.UUID, name string, page, perPage int) (*gatewaytypes.GettableIngestionKeys, error) {
|
||||
qParams := url.Values{}
|
||||
qParams.Add("name", name)
|
||||
qParams.Add("page", strconv.Itoa(page))
|
||||
qParams.Add("per_page", strconv.Itoa(perPage))
|
||||
|
||||
responseBody, err := provider.do(ctx, orgID, http.MethodGet, "/v1/workspaces/me/keys/search", qParams, nil)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var ingestionKeys []gatewaytypes.IngestionKey
|
||||
if err := json.Unmarshal([]byte(gjson.GetBytes(responseBody, "data").String()), &ingestionKeys); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var pagination gatewaytypes.Pagination
|
||||
if err := json.Unmarshal([]byte(gjson.GetBytes(responseBody, "_pagination").String()), &pagination); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
return &gatewaytypes.GettableIngestionKeys{
|
||||
Keys: ingestionKeys,
|
||||
Pagination: pagination,
|
||||
}, nil
|
||||
}
|
||||
|
||||
func (provider *Provider) CreateIngestionKey(ctx context.Context, orgID valuer.UUID, name string, tags []string, expiresAt time.Time) (*gatewaytypes.GettableCreatedIngestionKey, error) {
|
||||
requestBody := gatewaytypes.PostableIngestionKey{
|
||||
Name: name,
|
||||
Tags: tags,
|
||||
ExpiresAt: expiresAt,
|
||||
}
|
||||
requestBodyBytes, err := json.Marshal(requestBody)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
responseBody, err := provider.do(ctx, orgID, http.MethodPost, "/v1/workspaces/me/keys", nil, requestBodyBytes)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var createdKeyResponse gatewaytypes.GettableCreatedIngestionKey
|
||||
if err := json.Unmarshal([]byte(gjson.GetBytes(responseBody, "data").String()), &createdKeyResponse); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
return &createdKeyResponse, nil
|
||||
}
|
||||
|
||||
func (provider *Provider) UpdateIngestionKey(ctx context.Context, orgID valuer.UUID, keyID string, name string, tags []string, expiresAt time.Time) error {
|
||||
requestBody := gatewaytypes.PostableIngestionKey{
|
||||
Name: name,
|
||||
Tags: tags,
|
||||
ExpiresAt: expiresAt,
|
||||
}
|
||||
requestBodyBytes, err := json.Marshal(requestBody)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
_, err = provider.do(ctx, orgID, http.MethodPatch, "/v1/workspaces/me/keys/"+keyID, nil, requestBodyBytes)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (provider *Provider) DeleteIngestionKey(ctx context.Context, orgID valuer.UUID, keyID string) error {
|
||||
_, err := provider.do(ctx, orgID, http.MethodDelete, "/v1/workspaces/me/keys/"+keyID, nil, nil)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (provider *Provider) CreateIngestionKeyLimit(ctx context.Context, orgID valuer.UUID, keyID string, signal string, limitConfig gatewaytypes.LimitConfig, tags []string) (*gatewaytypes.GettableCreatedIngestionKeyLimit, error) {
|
||||
requestBody := gatewaytypes.PostableIngestionKeyLimit{
|
||||
Signal: signal,
|
||||
Config: limitConfig,
|
||||
Tags: tags,
|
||||
}
|
||||
requestBodyBytes, err := json.Marshal(requestBody)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
responseBody, err := provider.do(ctx, orgID, http.MethodPost, "/v1/workspaces/me/keys/"+keyID+"/limits", nil, requestBodyBytes)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var createdIngestionKeyLimitResponse gatewaytypes.GettableCreatedIngestionKeyLimit
|
||||
if err := json.Unmarshal([]byte(gjson.GetBytes(responseBody, "data").String()), &createdIngestionKeyLimitResponse); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
return &createdIngestionKeyLimitResponse, nil
|
||||
}
|
||||
|
||||
func (provider *Provider) UpdateIngestionKeyLimit(ctx context.Context, orgID valuer.UUID, limitID string, limitConfig gatewaytypes.LimitConfig, tags []string) error {
|
||||
requestBody := gatewaytypes.UpdatableIngestionKeyLimit{
|
||||
Config: limitConfig,
|
||||
Tags: tags,
|
||||
}
|
||||
requestBodyBytes, err := json.Marshal(requestBody)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
_, err = provider.do(ctx, orgID, http.MethodPatch, "/v1/workspaces/me/limits/"+limitID, nil, requestBodyBytes)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (provider *Provider) DeleteIngestionKeyLimit(ctx context.Context, orgID valuer.UUID, limitID string) error {
|
||||
_, err := provider.do(ctx, orgID, http.MethodDelete, "/v1/workspaces/me/limits/"+limitID, nil, nil)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (provider *Provider) do(ctx context.Context, orgID valuer.UUID, method string, path string, queryParams url.Values, body []byte) ([]byte, error) {
|
||||
license, err := provider.licensing.GetActive(ctx, orgID)
|
||||
if err != nil {
|
||||
return nil, errors.New(errors.TypeLicenseUnavailable, errors.CodeLicenseUnavailable, "no valid license found").WithAdditional("this feature requires a valid license").WithAdditional(err.Error())
|
||||
}
|
||||
|
||||
// build url
|
||||
requestURL := provider.config.URL.JoinPath(path)
|
||||
|
||||
// add query params to the url
|
||||
if queryParams != nil {
|
||||
requestURL.RawQuery = queryParams.Encode()
|
||||
}
|
||||
|
||||
// build request
|
||||
request, err := http.NewRequestWithContext(ctx, method, requestURL.String(), bytes.NewBuffer(body))
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// add headers needed to call gateway
|
||||
request.Header.Set("Content-Type", "application/json")
|
||||
request.Header.Set("X-Signoz-Cloud-Api-Key", license.Key)
|
||||
request.Header.Set("X-Consumer-Username", "lid:00000000-0000-0000-0000-000000000000")
|
||||
request.Header.Set("X-Consumer-Groups", "ns:default")
|
||||
|
||||
// execute request
|
||||
response, err := provider.httpClient.Do(request)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// read response
|
||||
defer response.Body.Close()
|
||||
responseBody, err := io.ReadAll(response.Body)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// only 2XX
|
||||
if response.StatusCode/100 == 2 {
|
||||
return responseBody, nil
|
||||
}
|
||||
|
||||
errorMessage := gjson.GetBytes(responseBody, "error").String()
|
||||
if errorMessage == "" {
|
||||
errorMessage = "an unknown error occurred"
|
||||
}
|
||||
|
||||
// return error for non 2XX
|
||||
return nil, provider.errFromStatusCode(response.StatusCode, errorMessage)
|
||||
}
|
||||
|
||||
func (provider *Provider) errFromStatusCode(code int, errorMessage string) error {
|
||||
switch code {
|
||||
case http.StatusBadRequest:
|
||||
return errors.New(errors.TypeInvalidInput, errors.CodeInvalidInput, errorMessage)
|
||||
case http.StatusUnauthorized:
|
||||
return errors.New(errors.TypeUnauthenticated, errors.CodeUnauthenticated, errorMessage)
|
||||
case http.StatusForbidden:
|
||||
return errors.New(errors.TypeForbidden, errors.CodeForbidden, errorMessage)
|
||||
case http.StatusNotFound:
|
||||
return errors.New(errors.TypeNotFound, errors.CodeNotFound, errorMessage)
|
||||
case http.StatusConflict:
|
||||
return errors.New(errors.TypeAlreadyExists, errors.CodeAlreadyExists, errorMessage)
|
||||
}
|
||||
|
||||
return errors.New(errors.TypeInternal, errors.CodeInternal, errorMessage)
|
||||
}
|
||||
@@ -9,6 +9,8 @@ import (
|
||||
"github.com/SigNoz/signoz/ee/query-service/integrations/gateway"
|
||||
"github.com/SigNoz/signoz/ee/query-service/usage"
|
||||
"github.com/SigNoz/signoz/pkg/alertmanager"
|
||||
"github.com/SigNoz/signoz/pkg/apis/fields"
|
||||
"github.com/SigNoz/signoz/pkg/global"
|
||||
"github.com/SigNoz/signoz/pkg/http/middleware"
|
||||
querierAPI "github.com/SigNoz/signoz/pkg/querier"
|
||||
baseapp "github.com/SigNoz/signoz/pkg/query-service/app"
|
||||
@@ -35,6 +37,7 @@ type APIHandlerOptions struct {
|
||||
GatewayUrl string
|
||||
// Querier Influx Interval
|
||||
FluxInterval time.Duration
|
||||
GlobalConfig global.Config
|
||||
}
|
||||
|
||||
type APIHandler struct {
|
||||
@@ -53,6 +56,7 @@ func NewAPIHandler(opts APIHandlerOptions, signoz *signoz.SigNoz) (*APIHandler,
|
||||
FluxInterval: opts.FluxInterval,
|
||||
AlertmanagerAPI: alertmanager.NewAPI(signoz.Alertmanager),
|
||||
LicensingAPI: httplicensing.NewLicensingAPI(signoz.Licensing),
|
||||
FieldsAPI: fields.NewAPI(signoz.Instrumentation.ToProviderSettings(), signoz.TelemetryStore),
|
||||
Signoz: signoz,
|
||||
QuerierAPI: querierAPI.NewAPI(signoz.Instrumentation.ToProviderSettings(), signoz.Querier, signoz.Analytics),
|
||||
QueryParserAPI: queryparser.NewAPI(signoz.Instrumentation.ToProviderSettings(), signoz.QueryParser),
|
||||
|
||||
@@ -76,7 +76,7 @@ func (ah *APIHandler) CloudIntegrationsGenerateConnectionParams(w http.ResponseW
|
||||
return
|
||||
}
|
||||
|
||||
ingestionUrl, signozApiUrl, apiErr := ah.getIngestionUrlAndSigNozAPIUrl(r.Context(), license.Key)
|
||||
signozApiUrl, apiErr := ah.getIngestionUrlAndSigNozAPIUrl(r.Context(), license.Key)
|
||||
if apiErr != nil {
|
||||
RespondError(w, basemodel.WrapApiError(
|
||||
apiErr, "couldn't deduce ingestion url and signoz api url",
|
||||
@@ -84,7 +84,7 @@ func (ah *APIHandler) CloudIntegrationsGenerateConnectionParams(w http.ResponseW
|
||||
return
|
||||
}
|
||||
|
||||
result.IngestionUrl = ingestionUrl
|
||||
result.IngestionUrl = ah.opts.GlobalConfig.IngestionURL.String()
|
||||
result.SigNozAPIUrl = signozApiUrl
|
||||
|
||||
gatewayUrl := ah.opts.GatewayUrl
|
||||
@@ -186,7 +186,7 @@ func (ah *APIHandler) getOrCreateCloudIntegrationUser(
|
||||
}
|
||||
|
||||
func (ah *APIHandler) getIngestionUrlAndSigNozAPIUrl(ctx context.Context, licenseKey string) (
|
||||
string, string, *basemodel.ApiError,
|
||||
string, *basemodel.ApiError,
|
||||
) {
|
||||
// TODO: remove this struct from here
|
||||
type deploymentResponse struct {
|
||||
@@ -200,7 +200,7 @@ func (ah *APIHandler) getIngestionUrlAndSigNozAPIUrl(ctx context.Context, licens
|
||||
|
||||
respBytes, err := ah.Signoz.Zeus.GetDeployment(ctx, licenseKey)
|
||||
if err != nil {
|
||||
return "", "", basemodel.InternalError(fmt.Errorf(
|
||||
return "", basemodel.InternalError(fmt.Errorf(
|
||||
"couldn't query for deployment info: error: %w", err,
|
||||
))
|
||||
}
|
||||
@@ -209,7 +209,7 @@ func (ah *APIHandler) getIngestionUrlAndSigNozAPIUrl(ctx context.Context, licens
|
||||
|
||||
err = json.Unmarshal(respBytes, resp)
|
||||
if err != nil {
|
||||
return "", "", basemodel.InternalError(fmt.Errorf(
|
||||
return "", basemodel.InternalError(fmt.Errorf(
|
||||
"couldn't unmarshal deployment info response: error: %w", err,
|
||||
))
|
||||
}
|
||||
@@ -219,16 +219,14 @@ func (ah *APIHandler) getIngestionUrlAndSigNozAPIUrl(ctx context.Context, licens
|
||||
|
||||
if len(regionDns) < 1 || len(deploymentName) < 1 {
|
||||
// Fail early if actual response structure and expectation here ever diverge
|
||||
return "", "", basemodel.InternalError(fmt.Errorf(
|
||||
return "", basemodel.InternalError(fmt.Errorf(
|
||||
"deployment info response not in expected shape. couldn't determine region dns and deployment name",
|
||||
))
|
||||
}
|
||||
|
||||
ingestionUrl := fmt.Sprintf("https://ingest.%s", regionDns)
|
||||
|
||||
signozApiUrl := fmt.Sprintf("https://%s.%s", deploymentName, regionDns)
|
||||
|
||||
return ingestionUrl, signozApiUrl, nil
|
||||
return signozApiUrl, nil
|
||||
}
|
||||
|
||||
type ingestionKey struct {
|
||||
|
||||
@@ -172,6 +172,7 @@ func NewServer(config signoz.Config, signoz *signoz.SigNoz) (*Server, error) {
|
||||
FluxInterval: config.Querier.FluxInterval,
|
||||
Gateway: gatewayProxy,
|
||||
GatewayUrl: config.Gateway.URL.String(),
|
||||
GlobalConfig: config.Global,
|
||||
}
|
||||
|
||||
apiHandler, err := api.NewAPIHandler(apiOpts, signoz)
|
||||
@@ -236,6 +237,7 @@ func (s *Server) createPublicServer(apiHandler *api.APIHandler, web web.Web) (*h
|
||||
apiHandler.RegisterLogsRoutes(r, am)
|
||||
apiHandler.RegisterIntegrationRoutes(r, am)
|
||||
apiHandler.RegisterCloudIntegrationsRoutes(r, am)
|
||||
apiHandler.RegisterFieldsRoutes(r, am)
|
||||
apiHandler.RegisterQueryRangeV3Routes(r, am)
|
||||
apiHandler.RegisterInfraMetricsRoutes(r, am)
|
||||
apiHandler.RegisterQueryRangeV4Routes(r, am)
|
||||
|
||||
@@ -240,11 +240,9 @@ func (r *AnomalyRule) buildAndRunQuery(ctx context.Context, orgID valuer.UUID, t
|
||||
r.logger.InfoContext(ctx, "anomaly scores", "scores", string(scoresJSON))
|
||||
|
||||
for _, series := range queryResult.AnomalyScores {
|
||||
if r.Condition() != nil && r.Condition().RequireMinPoints {
|
||||
if len(series.Points) < r.Condition().RequiredNumPoints {
|
||||
r.logger.InfoContext(ctx, "not enough data points to evaluate series, skipping", "ruleid", r.ID(), "numPoints", len(series.Points), "requiredPoints", r.Condition().RequiredNumPoints)
|
||||
continue
|
||||
}
|
||||
if !r.Condition().ShouldEval(series) {
|
||||
r.logger.InfoContext(ctx, "not enough data points to evaluate series, skipping", "ruleid", r.ID(), "numPoints", len(series.Points), "requiredPoints", r.Condition().RequiredNumPoints)
|
||||
continue
|
||||
}
|
||||
results, err := r.Threshold.Eval(*series, r.Unit(), ruletypes.EvalData{
|
||||
ActiveAlerts: r.ActiveAlertsLabelFP(),
|
||||
@@ -305,11 +303,9 @@ func (r *AnomalyRule) buildAndRunQueryV5(ctx context.Context, orgID valuer.UUID,
|
||||
}
|
||||
|
||||
for _, series := range seriesToProcess {
|
||||
if r.Condition().RequireMinPoints {
|
||||
if len(series.Points) < r.Condition().RequiredNumPoints {
|
||||
r.logger.InfoContext(ctx, "not enough data points to evaluate series, skipping", "ruleid", r.ID(), "numPoints", len(series.Points), "requiredPoints", r.Condition().RequiredNumPoints)
|
||||
continue
|
||||
}
|
||||
if !r.Condition().ShouldEval(series) {
|
||||
r.logger.InfoContext(ctx, "not enough data points to evaluate series, skipping", "ruleid", r.ID(), "numPoints", len(series.Points), "requiredPoints", r.Condition().RequiredNumPoints)
|
||||
continue
|
||||
}
|
||||
results, err := r.Threshold.Eval(*series, r.Unit(), ruletypes.EvalData{
|
||||
ActiveAlerts: r.ActiveAlertsLabelFP(),
|
||||
@@ -323,7 +319,7 @@ func (r *AnomalyRule) buildAndRunQueryV5(ctx context.Context, orgID valuer.UUID,
|
||||
return resultVector, nil
|
||||
}
|
||||
|
||||
func (r *AnomalyRule) Eval(ctx context.Context, ts time.Time) (interface{}, error) {
|
||||
func (r *AnomalyRule) Eval(ctx context.Context, ts time.Time) (int, error) {
|
||||
|
||||
prevState := r.State()
|
||||
|
||||
@@ -340,7 +336,7 @@ func (r *AnomalyRule) Eval(ctx context.Context, ts time.Time) (interface{}, erro
|
||||
res, err = r.buildAndRunQuery(ctx, r.OrgID(), ts)
|
||||
}
|
||||
if err != nil {
|
||||
return nil, err
|
||||
return 0, err
|
||||
}
|
||||
|
||||
r.mtx.Lock()
|
||||
@@ -415,7 +411,7 @@ func (r *AnomalyRule) Eval(ctx context.Context, ts time.Time) (interface{}, erro
|
||||
if _, ok := alerts[h]; ok {
|
||||
r.logger.ErrorContext(ctx, "the alert query returns duplicate records", "rule_id", r.ID(), "alert", alerts[h])
|
||||
err = fmt.Errorf("duplicate alert found, vector contains metrics with the same labelset after applying alert labels")
|
||||
return nil, err
|
||||
return 0, err
|
||||
}
|
||||
|
||||
alerts[h] = &ruletypes.Alert{
|
||||
|
||||
@@ -10,7 +10,7 @@ import (
|
||||
basemodel "github.com/SigNoz/signoz/pkg/query-service/model"
|
||||
baserules "github.com/SigNoz/signoz/pkg/query-service/rules"
|
||||
"github.com/SigNoz/signoz/pkg/query-service/utils/labels"
|
||||
ruletypes "github.com/SigNoz/signoz/pkg/types/ruletypes"
|
||||
"github.com/SigNoz/signoz/pkg/types/ruletypes"
|
||||
"github.com/SigNoz/signoz/pkg/valuer"
|
||||
"github.com/google/uuid"
|
||||
"go.uber.org/zap"
|
||||
@@ -47,7 +47,7 @@ func PrepareTaskFunc(opts baserules.PrepareTaskOptions) (baserules.Task, error)
|
||||
|
||||
rules = append(rules, tr)
|
||||
|
||||
// create ch rule task for evalution
|
||||
// create ch rule task for evaluation
|
||||
task = newTask(baserules.TaskTypeCh, opts.TaskName, time.Duration(evaluation.GetFrequency()), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
|
||||
|
||||
} else if opts.Rule.RuleType == ruletypes.RuleTypeProm {
|
||||
@@ -71,7 +71,7 @@ func PrepareTaskFunc(opts baserules.PrepareTaskOptions) (baserules.Task, error)
|
||||
|
||||
rules = append(rules, pr)
|
||||
|
||||
// create promql rule task for evalution
|
||||
// create promql rule task for evaluation
|
||||
task = newTask(baserules.TaskTypeProm, opts.TaskName, time.Duration(evaluation.GetFrequency()), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
|
||||
|
||||
} else if opts.Rule.RuleType == ruletypes.RuleTypeAnomaly {
|
||||
@@ -95,7 +95,7 @@ func PrepareTaskFunc(opts baserules.PrepareTaskOptions) (baserules.Task, error)
|
||||
|
||||
rules = append(rules, ar)
|
||||
|
||||
// create anomaly rule task for evalution
|
||||
// create anomaly rule task for evaluation
|
||||
task = newTask(baserules.TaskTypeCh, opts.TaskName, time.Duration(evaluation.GetFrequency()), rules, opts.ManagerOpts, opts.NotifyFunc, opts.MaintenanceStore, opts.OrgID)
|
||||
|
||||
} else {
|
||||
@@ -203,16 +203,12 @@ func TestNotification(opts baserules.PrepareTestRuleOptions) (int, *basemodel.Ap
|
||||
// set timestamp to current utc time
|
||||
ts := time.Now().UTC()
|
||||
|
||||
count, err := rule.Eval(ctx, ts)
|
||||
alertsFound, err := rule.Eval(ctx, ts)
|
||||
if err != nil {
|
||||
zap.L().Error("evaluating rule failed", zap.String("rule", rule.Name()), zap.Error(err))
|
||||
return 0, basemodel.InternalError(fmt.Errorf("rule evaluation failed"))
|
||||
}
|
||||
alertsFound, ok := count.(int)
|
||||
if !ok {
|
||||
return 0, basemodel.InternalError(fmt.Errorf("something went wrong"))
|
||||
}
|
||||
rule.SendAlerts(ctx, ts, 0, time.Duration(1*time.Minute), opts.NotifyFunc)
|
||||
rule.SendAlerts(ctx, ts, 0, time.Minute, opts.NotifyFunc)
|
||||
|
||||
return alertsFound, nil
|
||||
}
|
||||
|
||||
@@ -9,7 +9,7 @@ import (
|
||||
"time"
|
||||
|
||||
"github.com/SigNoz/signoz/pkg/alertmanager"
|
||||
alertmanagermock "github.com/SigNoz/signoz/pkg/alertmanager/mocks"
|
||||
alertmanagermock "github.com/SigNoz/signoz/pkg/alertmanager/alertmanagertest"
|
||||
"github.com/SigNoz/signoz/pkg/instrumentation/instrumentationtest"
|
||||
"github.com/SigNoz/signoz/pkg/prometheus"
|
||||
"github.com/SigNoz/signoz/pkg/prometheus/prometheustest"
|
||||
|
||||
@@ -2,4 +2,6 @@ node_modules
|
||||
build
|
||||
*.typegen.ts
|
||||
i18-generate-hash.js
|
||||
src/parser/TraceOperatorParser/**
|
||||
src/parser/TraceOperatorParser/**
|
||||
|
||||
orval.config.ts
|
||||
@@ -1,3 +1,6 @@
|
||||
/**
|
||||
* ESLint Configuration for SigNoz Frontend
|
||||
*/
|
||||
module.exports = {
|
||||
ignorePatterns: ['src/parser/*.ts', 'scripts/update-registry.js'],
|
||||
env: {
|
||||
@@ -10,11 +13,9 @@ module.exports = {
|
||||
'eslint:recommended',
|
||||
'plugin:react/recommended',
|
||||
'plugin:@typescript-eslint/recommended',
|
||||
'plugin:@typescript-eslint/eslint-recommended',
|
||||
'plugin:react-hooks/recommended',
|
||||
'plugin:prettier/recommended',
|
||||
'plugin:sonarjs/recommended',
|
||||
'plugin:import/errors',
|
||||
'plugin:import/warnings',
|
||||
'plugin:react/jsx-runtime',
|
||||
],
|
||||
parser: '@typescript-eslint/parser',
|
||||
@@ -23,17 +24,21 @@ module.exports = {
|
||||
ecmaFeatures: {
|
||||
jsx: true,
|
||||
},
|
||||
ecmaVersion: 12,
|
||||
ecmaVersion: 2021,
|
||||
sourceType: 'module',
|
||||
},
|
||||
plugins: [
|
||||
'react',
|
||||
'@typescript-eslint',
|
||||
'simple-import-sort',
|
||||
'react-hooks',
|
||||
'prettier',
|
||||
'jest',
|
||||
'jsx-a11y',
|
||||
'react', // React-specific rules
|
||||
'@typescript-eslint', // TypeScript linting
|
||||
'simple-import-sort', // Auto-sort imports
|
||||
'react-hooks', // React Hooks rules
|
||||
'prettier', // Code formatting
|
||||
'jest', // Jest test rules
|
||||
'jsx-a11y', // Accessibility rules
|
||||
'import', // Import/export linting
|
||||
'sonarjs', // Code quality/complexity
|
||||
// TODO: Uncomment after running: yarn add -D eslint-plugin-spellcheck
|
||||
// 'spellcheck', // Correct spellings
|
||||
],
|
||||
settings: {
|
||||
react: {
|
||||
@@ -47,81 +52,110 @@ module.exports = {
|
||||
},
|
||||
},
|
||||
rules: {
|
||||
// Code quality rules
|
||||
'prefer-const': 'error', // Enforces const for variables never reassigned
|
||||
'no-var': 'error', // Disallows var, enforces let/const
|
||||
'no-else-return': ['error', { allowElseIf: false }], // Reduces nesting by disallowing else after return
|
||||
'no-cond-assign': 'error', // Prevents accidental assignment in conditions (if (x = 1) instead of if (x === 1))
|
||||
'no-debugger': 'error', // Disallows debugger statements in production code
|
||||
curly: 'error', // Requires curly braces for all control statements
|
||||
eqeqeq: ['error', 'always', { null: 'ignore' }], // Enforces === and !== (allows == null for null/undefined check)
|
||||
'no-console': ['error', { allow: ['warn', 'error'] }], // Warns on console.log, allows console.warn/error
|
||||
|
||||
// TypeScript rules
|
||||
'@typescript-eslint/explicit-function-return-type': 'error', // Requires explicit return types on functions
|
||||
'@typescript-eslint/no-unused-vars': [
|
||||
// Disallows unused variables/args
|
||||
'error',
|
||||
{
|
||||
argsIgnorePattern: '^_', // Allows unused args prefixed with _ (e.g., _unusedParam)
|
||||
varsIgnorePattern: '^_', // Allows unused vars prefixed with _ (e.g., _unusedVar)
|
||||
},
|
||||
],
|
||||
'@typescript-eslint/no-explicit-any': 'warn', // Warns when using 'any' type (consider upgrading to error)
|
||||
// TODO: Change to 'error' after fixing ~80 empty function placeholders in providers/contexts
|
||||
'@typescript-eslint/no-empty-function': 'off', // Disallows empty function bodies
|
||||
'@typescript-eslint/no-var-requires': 'error', // Disallows require() in TypeScript (use import instead)
|
||||
'@typescript-eslint/ban-ts-comment': 'off', // Allows @ts-ignore comments (sometimes needed for third-party libs)
|
||||
'no-empty-function': 'off', // Disabled in favor of TypeScript version above
|
||||
|
||||
// React rules
|
||||
'react/jsx-filename-extension': [
|
||||
'error',
|
||||
{
|
||||
extensions: ['.tsx', '.js', '.jsx'],
|
||||
extensions: ['.tsx', '.jsx'], // Warns if JSX is used in non-.jsx/.tsx files
|
||||
},
|
||||
],
|
||||
'react/prop-types': 'off',
|
||||
'@typescript-eslint/explicit-function-return-type': 'error',
|
||||
'@typescript-eslint/no-var-requires': 'error',
|
||||
'react/no-array-index-key': 'error',
|
||||
'linebreak-style': [
|
||||
'error',
|
||||
process.env.platform === 'win32' ? 'windows' : 'unix',
|
||||
],
|
||||
'@typescript-eslint/default-param-last': 'off',
|
||||
'react/prop-types': 'off', // Disabled - using TypeScript instead
|
||||
'react/jsx-props-no-spreading': 'off', // Allows {...props} spreading (common in HOCs, forms, wrappers)
|
||||
'react/no-array-index-key': 'error', // Prevents using array index as key (causes bugs when list changes)
|
||||
|
||||
// simple sort error
|
||||
'simple-import-sort/imports': 'error',
|
||||
'simple-import-sort/exports': 'error',
|
||||
|
||||
// hooks
|
||||
'react-hooks/rules-of-hooks': 'error',
|
||||
'react-hooks/exhaustive-deps': 'error',
|
||||
|
||||
'import/prefer-default-export': 'off',
|
||||
'import/extensions': [
|
||||
'error',
|
||||
'ignorePackages',
|
||||
{
|
||||
js: 'never',
|
||||
jsx: 'never',
|
||||
ts: 'never',
|
||||
tsx: 'never',
|
||||
},
|
||||
],
|
||||
'import/no-extraneous-dependencies': ['error', { devDependencies: true }],
|
||||
// Disabled because TypeScript already handles this check more accurately,
|
||||
// and the rule has false positives with type-only imports (e.g., TooltipProps from antd)
|
||||
'import/named': 'off',
|
||||
'no-plusplus': 'off',
|
||||
// Accessibility rules
|
||||
'jsx-a11y/label-has-associated-control': [
|
||||
'error',
|
||||
{
|
||||
required: {
|
||||
some: ['nesting', 'id'],
|
||||
some: ['nesting', 'id'], // Labels must either wrap inputs or use htmlFor/id
|
||||
},
|
||||
},
|
||||
],
|
||||
'jsx-a11y/label-has-for': [
|
||||
'error',
|
||||
{
|
||||
required: {
|
||||
some: ['nesting', 'id'],
|
||||
},
|
||||
},
|
||||
],
|
||||
// Allow empty functions for mocks, default context values, and noop callbacks
|
||||
'@typescript-eslint/no-empty-function': 'off',
|
||||
// Allow underscore prefix for intentionally unused variables (e.g., const { id: _id, ...rest } = props)
|
||||
'@typescript-eslint/no-unused-vars': 'warn',
|
||||
'func-style': ['error', 'declaration', { allowArrowFunctions: true }],
|
||||
'arrow-body-style': ['error', 'as-needed'],
|
||||
|
||||
// eslint rules need to remove
|
||||
'@typescript-eslint/no-shadow': 'off',
|
||||
'import/no-cycle': 'off',
|
||||
// https://typescript-eslint.io/rules/consistent-return/ check the warning for details
|
||||
'consistent-return': 'off',
|
||||
// React Hooks rules
|
||||
'react-hooks/rules-of-hooks': 'error', // Enforces Rules of Hooks (only call at top level)
|
||||
'react-hooks/exhaustive-deps': 'warn', // Warns about missing dependencies in useEffect/useMemo/useCallback
|
||||
|
||||
// Import/export rules
|
||||
'import/extensions': [
|
||||
'error',
|
||||
'ignorePackages',
|
||||
{
|
||||
js: 'never', // Disallows .js extension in imports
|
||||
jsx: 'never', // Disallows .jsx extension in imports
|
||||
ts: 'never', // Disallows .ts extension in imports
|
||||
tsx: 'never', // Disallows .tsx extension in imports
|
||||
},
|
||||
],
|
||||
'import/no-extraneous-dependencies': ['error', { devDependencies: true }], // Prevents importing packages not in package.json
|
||||
// 'import/no-cycle': 'warn', // TODO: Enable later to detect circular dependencies
|
||||
|
||||
// TODO: Enable in separate PR with auto fixes
|
||||
// // Import sorting rules
|
||||
// 'simple-import-sort/imports': [
|
||||
// 'error',
|
||||
// {
|
||||
// groups: [
|
||||
// ['^react', '^@?\\w'], // React first, then external packages
|
||||
// ['^@/'], // Absolute imports with @ alias
|
||||
// ['^\\u0000'], // Side effect imports (import './file')
|
||||
// ['^\\.'], // Relative imports
|
||||
// ['^.+\\.s?css$'], // Style imports
|
||||
// ],
|
||||
// },
|
||||
// ],
|
||||
// 'simple-import-sort/exports': 'error', // Auto-sorts exports
|
||||
|
||||
// Prettier - code formatting
|
||||
'prettier/prettier': [
|
||||
'error',
|
||||
{},
|
||||
{
|
||||
usePrettierrc: true,
|
||||
usePrettierrc: true, // Uses .prettierrc.json for formatting rules
|
||||
},
|
||||
],
|
||||
'react/jsx-props-no-spreading': 'off',
|
||||
|
||||
// SonarJS - code quality and complexity
|
||||
'sonarjs/no-duplicate-string': 'off', // Disabled - can be noisy (enable periodically to check)
|
||||
},
|
||||
overrides: [
|
||||
{
|
||||
files: ['src/api/generated/**/*.ts'],
|
||||
rules: {
|
||||
'@typescript-eslint/explicit-function-return-type': 'off',
|
||||
'@typescript-eslint/explicit-module-boundary-types': 'off',
|
||||
'no-nested-ternary': 'off',
|
||||
'@typescript-eslint/no-unused-vars': 'warn',
|
||||
'sonarjs/no-duplicate-string': 'off',
|
||||
},
|
||||
},
|
||||
],
|
||||
};
|
||||
|
||||
@@ -7,7 +7,7 @@
|
||||
"semi": true,
|
||||
"printWidth": 80,
|
||||
"bracketSpacing": true,
|
||||
"bracketSameLine": false,
|
||||
"jsxBracketSameLine": false,
|
||||
"arrowParens": "always",
|
||||
"endOfLine": "lf",
|
||||
"quoteProps": "as-needed",
|
||||
|
||||
frontend/orval.config.ts (new file, 97 lines)
@@ -0,0 +1,97 @@
|
||||
/**
|
||||
* When making changes to this file, remove the file name from .eslintignore and tsconfig.json.
|
||||
* This is required because moduleResolution is "node"; changing that is a more involved effort.
|
||||
* So, until then, we will keep this file ignored for eslint and typescript.
|
||||
*/
|
||||
|
||||
import { defineConfig } from 'orval';
|
||||
|
||||
export default defineConfig({
|
||||
signoz: {
|
||||
input: {
|
||||
target: '../docs/api/openapi.yml',
|
||||
},
|
||||
output: {
|
||||
target: './src/api/generated/services',
|
||||
client: 'react-query',
|
||||
httpClient: 'axios',
|
||||
mode: 'tags-split',
|
||||
prettier: true,
|
||||
headers: true,
|
||||
clean: true,
|
||||
override: {
|
||||
query: {
|
||||
useQuery: true,
|
||||
useMutation: true,
|
||||
useInvalidate: true,
|
||||
signal: true,
|
||||
useOperationIdAsQueryKey: true,
|
||||
},
|
||||
useDates: true,
|
||||
useNamedParameters: true,
|
||||
enumGenerationType: 'enum',
|
||||
mutator: {
|
||||
path: './src/api/index.ts',
|
||||
name: 'GeneratedAPIInstance',
|
||||
},
|
||||
|
||||
jsDoc: {
|
||||
filter: (schema) => {
|
||||
const allowlist = [
|
||||
'type',
|
||||
'format',
|
||||
'maxLength',
|
||||
'minLength',
|
||||
'description',
|
||||
'minimum',
|
||||
'maximum',
|
||||
'exclusiveMinimum',
|
||||
'exclusiveMaximum',
|
||||
'pattern',
|
||||
'nullable',
|
||||
'enum',
|
||||
];
|
||||
return Object.entries(schema || {})
|
||||
.filter(([key]) => allowlist.includes(key))
|
||||
.map(([key, value]: [string, any]) => ({
|
||||
key,
|
||||
value,
|
||||
}))
|
||||
.sort((a, b) => a.key.length - b.key.length);
|
||||
},
|
||||
},
|
||||
|
||||
components: {
|
||||
schemas: {
|
||||
suffix: 'DTO',
|
||||
},
|
||||
responses: {
|
||||
suffix: 'Response',
|
||||
},
|
||||
parameters: {
|
||||
suffix: 'Params',
|
||||
},
|
||||
requestBodies: {
|
||||
suffix: 'Body',
|
||||
},
|
||||
},
|
||||
|
||||
// info is of type InfoObject from openapi spec
|
||||
header: (info: { title: string; version: string }): string[] => [
|
||||
`! Do not edit manually`,
|
||||
`* The file has been auto-generated using Orval for SigNoz`,
|
||||
`* regenerate with 'yarn generate:api'`,
|
||||
...(info.title ? [info.title] : []),
|
||||
...(info.version ? [`OpenAPI spec version: ${info.version}`] : []),
|
||||
],
|
||||
|
||||
// @ts-expect-error
|
||||
// propertySortOrder, urlEncodeParameters, aliasCombinedTypes
|
||||
// are valid options in the document without types
|
||||
propertySortOrder: 'Alphabetical',
|
||||
urlEncodeParameters: true,
|
||||
aliasCombinedTypes: true,
|
||||
},
|
||||
},
|
||||
},
|
||||
});
|
||||
@@ -18,7 +18,8 @@
|
||||
"husky:configure": "cd .. && husky install frontend/.husky && cd frontend && chmod ug+x .husky/*",
|
||||
"commitlint": "commitlint --edit $1",
|
||||
"test": "jest",
|
||||
"test:changedsince": "jest --changedSince=main --coverage --silent"
|
||||
"test:changedsince": "jest --changedSince=main --coverage --silent",
|
||||
"generate:api": "orval --config ./orval.config.ts && sh scripts/post-types-generation.sh && prettier --write src/api/generated && (eslint --fix src/api/generated || true)"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=16.15.0"
|
||||
@@ -48,6 +49,7 @@
|
||||
"@signozhq/calendar": "0.0.0",
|
||||
"@signozhq/callout": "0.0.2",
|
||||
"@signozhq/checkbox": "0.0.2",
|
||||
"@signozhq/combobox": "0.0.2",
|
||||
"@signozhq/command": "0.0.0",
|
||||
"@signozhq/design-tokens": "1.1.4",
|
||||
"@signozhq/input": "0.0.2",
|
||||
@@ -237,6 +239,7 @@
|
||||
"lint-staged": "^12.5.0",
|
||||
"msw": "1.3.2",
|
||||
"npm-run-all": "latest",
|
||||
"orval": "7.18.0",
|
||||
"portfinder-sync": "^0.0.2",
|
||||
"postcss": "8.4.38",
|
||||
"prettier": "2.2.1",
|
||||
|
||||
frontend/scripts/post-types-generation.sh (new executable file, 14 lines)
@@ -0,0 +1,14 @@
|
||||
#!/bin/bash
|
||||
|
||||
# Rename tag files to index.ts in services directories
|
||||
# tags-split creates: services/tagName/tagName.ts -> rename to services/tagName/index.ts
|
||||
find src/api/generated/services -mindepth 1 -maxdepth 1 -type d | while read -r dir; do
|
||||
dirname=$(basename "$dir")
|
||||
tagfile="$dir/$dirname.ts"
|
||||
if [ -f "$tagfile" ]; then
|
||||
mv "$tagfile" "$dir/index.ts"
|
||||
echo "Renamed $tagfile -> $dir/index.ts"
|
||||
fi
|
||||
done
|
||||
|
||||
echo "Tag files renamed to index.ts"
|
||||
frontend/src/api/generated/services/authdomains/index.ts (new file, 378 lines)
@@ -0,0 +1,378 @@
|
||||
/**
|
||||
* ! Do not edit manually
|
||||
* * The file has been auto-generated using Orval for SigNoz
|
||||
* * regenerate with 'yarn generate:api'
|
||||
* SigNoz
|
||||
*/
|
||||
import { useMutation, useQuery } from 'react-query';
|
||||
import type {
|
||||
InvalidateOptions,
|
||||
MutationFunction,
|
||||
QueryClient,
|
||||
QueryFunction,
|
||||
QueryKey,
|
||||
UseMutationOptions,
|
||||
UseMutationResult,
|
||||
UseQueryOptions,
|
||||
UseQueryResult,
|
||||
} from 'react-query';
|
||||
|
||||
import type {
|
||||
AuthtypesPostableAuthDomainDTO,
|
||||
AuthtypesUpdateableAuthDomainDTO,
|
||||
CreateAuthDomain200,
|
||||
DeleteAuthDomainPathParameters,
|
||||
ListAuthDomains200,
|
||||
RenderErrorResponseDTO,
|
||||
UpdateAuthDomainPathParameters,
|
||||
} from '../sigNoz.schemas';
|
||||
|
||||
import { GeneratedAPIInstance } from '../../../index';
|
||||
|
||||
type AwaitedInput<T> = PromiseLike<T> | T;
|
||||
|
||||
type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;
|
||||
|
||||
/**
|
||||
* This endpoint lists all auth domains
|
||||
* @summary List all auth domains
|
||||
*/
|
||||
export const listAuthDomains = (signal?: AbortSignal) => {
|
||||
return GeneratedAPIInstance<ListAuthDomains200>({
|
||||
url: `/api/v1/domains`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getListAuthDomainsQueryKey = () => {
|
||||
return ['listAuthDomains'] as const;
|
||||
};
|
||||
|
||||
export const getListAuthDomainsQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof listAuthDomains>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listAuthDomains>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey = queryOptions?.queryKey ?? getListAuthDomainsQueryKey();
|
||||
|
||||
const queryFn: QueryFunction<Awaited<ReturnType<typeof listAuthDomains>>> = ({
|
||||
signal,
|
||||
}) => listAuthDomains(signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listAuthDomains>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type ListAuthDomainsQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof listAuthDomains>>
|
||||
>;
|
||||
export type ListAuthDomainsQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary List all auth domains
|
||||
*/
|
||||
|
||||
export function useListAuthDomains<
|
||||
TData = Awaited<ReturnType<typeof listAuthDomains>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listAuthDomains>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getListAuthDomainsQueryOptions(options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary List all auth domains
|
||||
*/
|
||||
export const invalidateListAuthDomains = async (
|
||||
queryClient: QueryClient,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getListAuthDomainsQueryKey() },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint creates an auth domain
|
||||
* @summary Create auth domain
|
||||
*/
|
||||
export const createAuthDomain = (
|
||||
authtypesPostableAuthDomainDTO: AuthtypesPostableAuthDomainDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<CreateAuthDomain200>({
|
||||
url: `/api/v1/domains`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: authtypesPostableAuthDomainDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getCreateAuthDomainMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createAuthDomain>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableAuthDomainDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createAuthDomain>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableAuthDomainDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['createAuthDomain'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof createAuthDomain>>,
|
||||
{ data: AuthtypesPostableAuthDomainDTO }
|
||||
> = (props) => {
|
||||
const { data } = props ?? {};
|
||||
|
||||
return createAuthDomain(data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type CreateAuthDomainMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof createAuthDomain>>
|
||||
>;
|
||||
export type CreateAuthDomainMutationBody = AuthtypesPostableAuthDomainDTO;
|
||||
export type CreateAuthDomainMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Create auth domain
|
||||
*/
|
||||
export const useCreateAuthDomain = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createAuthDomain>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableAuthDomainDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof createAuthDomain>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableAuthDomainDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getCreateAuthDomainMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint deletes an auth domain
|
||||
* @summary Delete auth domain
|
||||
*/
|
||||
export const deleteAuthDomain = ({ id }: DeleteAuthDomainPathParameters) => {
|
||||
return GeneratedAPIInstance<void>({
|
||||
url: `/api/v1/domains/${id}`,
|
||||
method: 'DELETE',
|
||||
});
|
||||
};
|
||||
|
||||
export const getDeleteAuthDomainMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteAuthDomain>>,
|
||||
TError,
|
||||
{ pathParams: DeleteAuthDomainPathParameters },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteAuthDomain>>,
|
||||
TError,
|
||||
{ pathParams: DeleteAuthDomainPathParameters },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['deleteAuthDomain'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof deleteAuthDomain>>,
|
||||
{ pathParams: DeleteAuthDomainPathParameters }
|
||||
> = (props) => {
|
||||
const { pathParams } = props ?? {};
|
||||
|
||||
return deleteAuthDomain(pathParams);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type DeleteAuthDomainMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof deleteAuthDomain>>
|
||||
>;
|
||||
|
||||
export type DeleteAuthDomainMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Delete auth domain
|
||||
*/
|
||||
export const useDeleteAuthDomain = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteAuthDomain>>,
|
||||
TError,
|
||||
{ pathParams: DeleteAuthDomainPathParameters },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof deleteAuthDomain>>,
|
||||
TError,
|
||||
{ pathParams: DeleteAuthDomainPathParameters },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getDeleteAuthDomainMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint updates an auth domain
|
||||
* @summary Update auth domain
|
||||
*/
|
||||
export const updateAuthDomain = (
|
||||
{ id }: UpdateAuthDomainPathParameters,
|
||||
authtypesUpdateableAuthDomainDTO: AuthtypesUpdateableAuthDomainDTO,
|
||||
) => {
|
||||
return GeneratedAPIInstance<void>({
|
||||
url: `/api/v1/domains/${id}`,
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: authtypesUpdateableAuthDomainDTO,
|
||||
});
|
||||
};
|
||||
|
||||
export const getUpdateAuthDomainMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateAuthDomain>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateAuthDomainPathParameters;
|
||||
data: AuthtypesUpdateableAuthDomainDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateAuthDomain>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateAuthDomainPathParameters;
|
||||
data: AuthtypesUpdateableAuthDomainDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['updateAuthDomain'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof updateAuthDomain>>,
|
||||
{
|
||||
pathParams: UpdateAuthDomainPathParameters;
|
||||
data: AuthtypesUpdateableAuthDomainDTO;
|
||||
}
|
||||
> = (props) => {
|
||||
const { pathParams, data } = props ?? {};
|
||||
|
||||
return updateAuthDomain(pathParams, data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type UpdateAuthDomainMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof updateAuthDomain>>
|
||||
>;
|
||||
export type UpdateAuthDomainMutationBody = AuthtypesUpdateableAuthDomainDTO;
|
||||
export type UpdateAuthDomainMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Update auth domain
|
||||
*/
|
||||
export const useUpdateAuthDomain = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateAuthDomain>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateAuthDomainPathParameters;
|
||||
data: AuthtypesUpdateableAuthDomainDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof updateAuthDomain>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateAuthDomainPathParameters;
|
||||
data: AuthtypesUpdateableAuthDomainDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getUpdateAuthDomainMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
frontend/src/api/generated/services/dashboard/index.ts (new file, 632 lines)
@@ -0,0 +1,632 @@
|
||||
/**
|
||||
* ! Do not edit manually
|
||||
* * The file has been auto-generated using Orval for SigNoz
|
||||
* * regenerate with 'yarn generate:api'
|
||||
* SigNoz
|
||||
*/
|
||||
import { useMutation, useQuery } from 'react-query';
|
||||
import type {
|
||||
InvalidateOptions,
|
||||
MutationFunction,
|
||||
QueryClient,
|
||||
QueryFunction,
|
||||
QueryKey,
|
||||
UseMutationOptions,
|
||||
UseMutationResult,
|
||||
UseQueryOptions,
|
||||
UseQueryResult,
|
||||
} from 'react-query';
|
||||
|
||||
import type {
|
||||
CreatePublicDashboard201,
|
||||
CreatePublicDashboardPathParameters,
|
||||
DashboardtypesPostablePublicDashboardDTO,
|
||||
DashboardtypesUpdatablePublicDashboardDTO,
|
||||
DeletePublicDashboardPathParameters,
|
||||
GetPublicDashboard200,
|
||||
GetPublicDashboardData200,
|
||||
GetPublicDashboardDataPathParameters,
|
||||
GetPublicDashboardPathParameters,
|
||||
GetPublicDashboardWidgetQueryRange200,
|
||||
GetPublicDashboardWidgetQueryRangePathParameters,
|
||||
RenderErrorResponseDTO,
|
||||
UpdatePublicDashboardPathParameters,
|
||||
} from '../sigNoz.schemas';
|
||||
|
||||
import { GeneratedAPIInstance } from '../../../index';
|
||||
|
||||
type AwaitedInput<T> = PromiseLike<T> | T;
|
||||
|
||||
type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;
|
||||
|
||||
/**
|
||||
* This endpoint deletes the public sharing config and disables public sharing of a dashboard
|
||||
* @summary Delete public dashboard
|
||||
*/
|
||||
export const deletePublicDashboard = ({
|
||||
id,
|
||||
}: DeletePublicDashboardPathParameters) => {
|
||||
return GeneratedAPIInstance<string>({
|
||||
url: `/api/v1/dashboards/${id}/public`,
|
||||
method: 'DELETE',
|
||||
});
|
||||
};
|
||||
|
||||
export const getDeletePublicDashboardMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deletePublicDashboard>>,
|
||||
TError,
|
||||
{ pathParams: DeletePublicDashboardPathParameters },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deletePublicDashboard>>,
|
||||
TError,
|
||||
{ pathParams: DeletePublicDashboardPathParameters },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['deletePublicDashboard'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof deletePublicDashboard>>,
|
||||
{ pathParams: DeletePublicDashboardPathParameters }
|
||||
> = (props) => {
|
||||
const { pathParams } = props ?? {};
|
||||
|
||||
return deletePublicDashboard(pathParams);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type DeletePublicDashboardMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof deletePublicDashboard>>
|
||||
>;
|
||||
|
||||
export type DeletePublicDashboardMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Delete public dashboard
|
||||
*/
|
||||
export const useDeletePublicDashboard = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deletePublicDashboard>>,
|
||||
TError,
|
||||
{ pathParams: DeletePublicDashboardPathParameters },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof deletePublicDashboard>>,
|
||||
TError,
|
||||
{ pathParams: DeletePublicDashboardPathParameters },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getDeletePublicDashboardMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoints returns public sharing config for a dashboard
|
||||
* @summary Get public dashboard
|
||||
*/
|
||||
export const getPublicDashboard = (
|
||||
{ id }: GetPublicDashboardPathParameters,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetPublicDashboard200>({
|
||||
url: `/api/v1/dashboards/${id}/public`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetPublicDashboardQueryKey = ({
|
||||
id,
|
||||
}: GetPublicDashboardPathParameters) => {
|
||||
return ['getPublicDashboard'] as const;
|
||||
};
|
||||
|
||||
export const getGetPublicDashboardQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getPublicDashboard>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ id }: GetPublicDashboardPathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getPublicDashboard>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey =
|
||||
queryOptions?.queryKey ?? getGetPublicDashboardQueryKey({ id });
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof getPublicDashboard>>
|
||||
> = ({ signal }) => getPublicDashboard({ id }, signal);
|
||||
|
||||
return {
|
||||
queryKey,
|
||||
queryFn,
|
||||
enabled: !!id,
|
||||
...queryOptions,
|
||||
} as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getPublicDashboard>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetPublicDashboardQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getPublicDashboard>>
|
||||
>;
|
||||
export type GetPublicDashboardQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get public dashboard
|
||||
*/
|
||||
|
||||
export function useGetPublicDashboard<
|
||||
TData = Awaited<ReturnType<typeof getPublicDashboard>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ id }: GetPublicDashboardPathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getPublicDashboard>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetPublicDashboardQueryOptions({ id }, options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get public dashboard
|
||||
*/
|
||||
export const invalidateGetPublicDashboard = async (
|
||||
queryClient: QueryClient,
|
||||
{ id }: GetPublicDashboardPathParameters,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetPublicDashboardQueryKey({ id }) },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoints creates public sharing config and enables public sharing of the dashboard
|
||||
* @summary Create public dashboard
|
||||
*/
|
||||
export const createPublicDashboard = (
|
||||
{ id }: CreatePublicDashboardPathParameters,
|
||||
dashboardtypesPostablePublicDashboardDTO: DashboardtypesPostablePublicDashboardDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<CreatePublicDashboard201>({
|
||||
url: `/api/v1/dashboards/${id}/public`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: dashboardtypesPostablePublicDashboardDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getCreatePublicDashboardMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createPublicDashboard>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: CreatePublicDashboardPathParameters;
|
||||
data: DashboardtypesPostablePublicDashboardDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createPublicDashboard>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: CreatePublicDashboardPathParameters;
|
||||
data: DashboardtypesPostablePublicDashboardDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['createPublicDashboard'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof createPublicDashboard>>,
|
||||
{
|
||||
pathParams: CreatePublicDashboardPathParameters;
|
||||
data: DashboardtypesPostablePublicDashboardDTO;
|
||||
}
|
||||
> = (props) => {
|
||||
const { pathParams, data } = props ?? {};
|
||||
|
||||
return createPublicDashboard(pathParams, data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type CreatePublicDashboardMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof createPublicDashboard>>
|
||||
>;
|
||||
export type CreatePublicDashboardMutationBody = DashboardtypesPostablePublicDashboardDTO;
|
||||
export type CreatePublicDashboardMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Create public dashboard
|
||||
*/
|
||||
export const useCreatePublicDashboard = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createPublicDashboard>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: CreatePublicDashboardPathParameters;
|
||||
data: DashboardtypesPostablePublicDashboardDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof createPublicDashboard>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: CreatePublicDashboardPathParameters;
|
||||
data: DashboardtypesPostablePublicDashboardDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getCreatePublicDashboardMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoints updates the public sharing config for a dashboard
|
||||
* @summary Update public dashboard
|
||||
*/
|
||||
export const updatePublicDashboard = (
|
||||
{ id }: UpdatePublicDashboardPathParameters,
|
||||
dashboardtypesUpdatablePublicDashboardDTO: DashboardtypesUpdatablePublicDashboardDTO,
|
||||
) => {
|
||||
return GeneratedAPIInstance<string>({
|
||||
url: `/api/v1/dashboards/${id}/public`,
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: dashboardtypesUpdatablePublicDashboardDTO,
|
||||
});
|
||||
};
|
||||
|
||||
export const getUpdatePublicDashboardMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updatePublicDashboard>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdatePublicDashboardPathParameters;
|
||||
data: DashboardtypesUpdatablePublicDashboardDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updatePublicDashboard>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdatePublicDashboardPathParameters;
|
||||
data: DashboardtypesUpdatablePublicDashboardDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['updatePublicDashboard'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof updatePublicDashboard>>,
|
||||
{
|
||||
pathParams: UpdatePublicDashboardPathParameters;
|
||||
data: DashboardtypesUpdatablePublicDashboardDTO;
|
||||
}
|
||||
> = (props) => {
|
||||
const { pathParams, data } = props ?? {};
|
||||
|
||||
return updatePublicDashboard(pathParams, data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type UpdatePublicDashboardMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof updatePublicDashboard>>
|
||||
>;
|
||||
export type UpdatePublicDashboardMutationBody = DashboardtypesUpdatablePublicDashboardDTO;
|
||||
export type UpdatePublicDashboardMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Update public dashboard
|
||||
*/
|
||||
export const useUpdatePublicDashboard = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updatePublicDashboard>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdatePublicDashboardPathParameters;
|
||||
data: DashboardtypesUpdatablePublicDashboardDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof updatePublicDashboard>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdatePublicDashboardPathParameters;
|
||||
data: DashboardtypesUpdatablePublicDashboardDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getUpdatePublicDashboardMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoints returns the sanitized dashboard data for public access
|
||||
* @summary Get public dashboard data
|
||||
*/
|
||||
export const getPublicDashboardData = (
|
||||
{ id }: GetPublicDashboardDataPathParameters,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetPublicDashboardData200>({
|
||||
url: `/api/v1/public/dashboards/${id}`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetPublicDashboardDataQueryKey = ({
|
||||
id,
|
||||
}: GetPublicDashboardDataPathParameters) => {
|
||||
return ['getPublicDashboardData'] as const;
|
||||
};
|
||||
|
||||
export const getGetPublicDashboardDataQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getPublicDashboardData>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ id }: GetPublicDashboardDataPathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getPublicDashboardData>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey =
|
||||
queryOptions?.queryKey ?? getGetPublicDashboardDataQueryKey({ id });
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof getPublicDashboardData>>
|
||||
> = ({ signal }) => getPublicDashboardData({ id }, signal);
|
||||
|
||||
return {
|
||||
queryKey,
|
||||
queryFn,
|
||||
enabled: !!id,
|
||||
...queryOptions,
|
||||
} as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getPublicDashboardData>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetPublicDashboardDataQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getPublicDashboardData>>
|
||||
>;
|
||||
export type GetPublicDashboardDataQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get public dashboard data
|
||||
*/
|
||||
|
||||
export function useGetPublicDashboardData<
|
||||
TData = Awaited<ReturnType<typeof getPublicDashboardData>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ id }: GetPublicDashboardDataPathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getPublicDashboardData>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetPublicDashboardDataQueryOptions({ id }, options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get public dashboard data
|
||||
*/
|
||||
export const invalidateGetPublicDashboardData = async (
|
||||
queryClient: QueryClient,
|
||||
{ id }: GetPublicDashboardDataPathParameters,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetPublicDashboardDataQueryKey({ id }) },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint return query range results for a widget of public dashboard
|
||||
* @summary Get query range result
|
||||
*/
|
||||
export const getPublicDashboardWidgetQueryRange = (
|
||||
{ id, idx }: GetPublicDashboardWidgetQueryRangePathParameters,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetPublicDashboardWidgetQueryRange200>({
|
||||
url: `/api/v1/public/dashboards/${id}/widgets/${idx}/query_range`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetPublicDashboardWidgetQueryRangeQueryKey = ({
|
||||
id,
|
||||
idx,
|
||||
}: GetPublicDashboardWidgetQueryRangePathParameters) => {
|
||||
return ['getPublicDashboardWidgetQueryRange'] as const;
|
||||
};
|
||||
|
||||
export const getGetPublicDashboardWidgetQueryRangeQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getPublicDashboardWidgetQueryRange>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ id, idx }: GetPublicDashboardWidgetQueryRangePathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getPublicDashboardWidgetQueryRange>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey =
|
||||
queryOptions?.queryKey ??
|
||||
getGetPublicDashboardWidgetQueryRangeQueryKey({ id, idx });
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof getPublicDashboardWidgetQueryRange>>
|
||||
> = ({ signal }) => getPublicDashboardWidgetQueryRange({ id, idx }, signal);
|
||||
|
||||
return {
|
||||
queryKey,
|
||||
queryFn,
|
||||
enabled: !!(id && idx),
|
||||
...queryOptions,
|
||||
} as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getPublicDashboardWidgetQueryRange>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetPublicDashboardWidgetQueryRangeQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getPublicDashboardWidgetQueryRange>>
|
||||
>;
|
||||
export type GetPublicDashboardWidgetQueryRangeQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get query range result
|
||||
*/
|
||||
|
||||
export function useGetPublicDashboardWidgetQueryRange<
|
||||
TData = Awaited<ReturnType<typeof getPublicDashboardWidgetQueryRange>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ id, idx }: GetPublicDashboardWidgetQueryRangePathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getPublicDashboardWidgetQueryRange>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetPublicDashboardWidgetQueryRangeQueryOptions(
|
||||
{ id, idx },
|
||||
options,
|
||||
);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get query range result
|
||||
*/
|
||||
export const invalidateGetPublicDashboardWidgetQueryRange = async (
|
||||
queryClient: QueryClient,
|
||||
{ id, idx }: GetPublicDashboardWidgetQueryRangePathParameters,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetPublicDashboardWidgetQueryRangeQueryKey({ id, idx }) },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
109
frontend/src/api/generated/services/features/index.ts
Normal file
109
frontend/src/api/generated/services/features/index.ts
Normal file
@@ -0,0 +1,109 @@
|
||||
/**
|
||||
* ! Do not edit manually
|
||||
* * The file has been auto-generated using Orval for SigNoz
|
||||
* * regenerate with 'yarn generate:api'
|
||||
* SigNoz
|
||||
*/
|
||||
import { useQuery } from 'react-query';
|
||||
import type {
|
||||
InvalidateOptions,
|
||||
QueryClient,
|
||||
QueryFunction,
|
||||
QueryKey,
|
||||
UseQueryOptions,
|
||||
UseQueryResult,
|
||||
} from 'react-query';
|
||||
|
||||
import type { GetFeatures200, RenderErrorResponseDTO } from '../sigNoz.schemas';
|
||||
|
||||
import { GeneratedAPIInstance } from '../../../index';
|
||||
|
||||
type AwaitedInput<T> = PromiseLike<T> | T;
|
||||
|
||||
type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;
|
||||
|
||||
/**
|
||||
* This endpoint returns the supported features and their details
|
||||
* @summary Get features
|
||||
*/
|
||||
export const getFeatures = (signal?: AbortSignal) => {
|
||||
return GeneratedAPIInstance<GetFeatures200>({
|
||||
url: `/api/v2/features`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetFeaturesQueryKey = () => {
|
||||
return ['getFeatures'] as const;
|
||||
};
|
||||
|
||||
export const getGetFeaturesQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getFeatures>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getFeatures>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey = queryOptions?.queryKey ?? getGetFeaturesQueryKey();
|
||||
|
||||
const queryFn: QueryFunction<Awaited<ReturnType<typeof getFeatures>>> = ({
|
||||
signal,
|
||||
}) => getFeatures(signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getFeatures>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetFeaturesQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getFeatures>>
|
||||
>;
|
||||
export type GetFeaturesQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get features
|
||||
*/
|
||||
|
||||
export function useGetFeatures<
|
||||
TData = Awaited<ReturnType<typeof getFeatures>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getFeatures>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetFeaturesQueryOptions(options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get features
|
||||
*/
|
||||
export const invalidateGetFeatures = async (
|
||||
queryClient: QueryClient,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetFeaturesQueryKey() },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
746
frontend/src/api/generated/services/gateway/index.ts
Normal file
746
frontend/src/api/generated/services/gateway/index.ts
Normal file
@@ -0,0 +1,746 @@
|
||||
/**
|
||||
* ! Do not edit manually
|
||||
* * The file has been auto-generated using Orval for SigNoz
|
||||
* * regenerate with 'yarn generate:api'
|
||||
* SigNoz
|
||||
*/
|
||||
import { useMutation, useQuery } from 'react-query';
|
||||
import type {
|
||||
InvalidateOptions,
|
||||
MutationFunction,
|
||||
QueryClient,
|
||||
QueryFunction,
|
||||
QueryKey,
|
||||
UseMutationOptions,
|
||||
UseMutationResult,
|
||||
UseQueryOptions,
|
||||
UseQueryResult,
|
||||
} from 'react-query';
|
||||
|
||||
import type {
|
||||
CreateIngestionKey200,
|
||||
CreateIngestionKeyLimit201,
|
||||
CreateIngestionKeyLimitPathParameters,
|
||||
DeleteIngestionKeyLimitPathParameters,
|
||||
DeleteIngestionKeyPathParameters,
|
||||
GatewaytypesPostableIngestionKeyDTO,
|
||||
GatewaytypesPostableIngestionKeyLimitDTO,
|
||||
GatewaytypesUpdatableIngestionKeyLimitDTO,
|
||||
GetIngestionKeys200,
|
||||
RenderErrorResponseDTO,
|
||||
SearchIngestionKeys200,
|
||||
UpdateIngestionKeyLimitPathParameters,
|
||||
UpdateIngestionKeyPathParameters,
|
||||
} from '../sigNoz.schemas';
|
||||
|
||||
import { GeneratedAPIInstance } from '../../../index';
|
||||
|
||||
type AwaitedInput<T> = PromiseLike<T> | T;
|
||||
|
||||
type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;
|
||||
|
||||
/**
|
||||
* This endpoint returns the ingestion keys for a workspace
|
||||
* @summary Get ingestion keys for workspace
|
||||
*/
|
||||
export const getIngestionKeys = (signal?: AbortSignal) => {
|
||||
return GeneratedAPIInstance<GetIngestionKeys200>({
|
||||
url: `/api/v2/gateway/ingestion_keys`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetIngestionKeysQueryKey = () => {
|
||||
return ['getIngestionKeys'] as const;
|
||||
};
|
||||
|
||||
export const getGetIngestionKeysQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getIngestionKeys>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getIngestionKeys>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey = queryOptions?.queryKey ?? getGetIngestionKeysQueryKey();
|
||||
|
||||
const queryFn: QueryFunction<Awaited<ReturnType<typeof getIngestionKeys>>> = ({
|
||||
signal,
|
||||
}) => getIngestionKeys(signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getIngestionKeys>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetIngestionKeysQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getIngestionKeys>>
|
||||
>;
|
||||
export type GetIngestionKeysQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get ingestion keys for workspace
|
||||
*/
|
||||
|
||||
export function useGetIngestionKeys<
|
||||
TData = Awaited<ReturnType<typeof getIngestionKeys>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getIngestionKeys>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetIngestionKeysQueryOptions(options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get ingestion keys for workspace
|
||||
*/
|
||||
export const invalidateGetIngestionKeys = async (
|
||||
queryClient: QueryClient,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetIngestionKeysQueryKey() },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint creates an ingestion key for the workspace
|
||||
* @summary Create ingestion key for workspace
|
||||
*/
|
||||
export const createIngestionKey = (
|
||||
gatewaytypesPostableIngestionKeyDTO: GatewaytypesPostableIngestionKeyDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<CreateIngestionKey200>({
|
||||
url: `/api/v2/gateway/ingestion_keys`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: gatewaytypesPostableIngestionKeyDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getCreateIngestionKeyMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createIngestionKey>>,
|
||||
TError,
|
||||
{ data: GatewaytypesPostableIngestionKeyDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createIngestionKey>>,
|
||||
TError,
|
||||
{ data: GatewaytypesPostableIngestionKeyDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['createIngestionKey'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof createIngestionKey>>,
|
||||
{ data: GatewaytypesPostableIngestionKeyDTO }
|
||||
> = (props) => {
|
||||
const { data } = props ?? {};
|
||||
|
||||
return createIngestionKey(data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type CreateIngestionKeyMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof createIngestionKey>>
|
||||
>;
|
||||
export type CreateIngestionKeyMutationBody = GatewaytypesPostableIngestionKeyDTO;
|
||||
export type CreateIngestionKeyMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Create ingestion key for workspace
|
||||
*/
|
||||
export const useCreateIngestionKey = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createIngestionKey>>,
|
||||
TError,
|
||||
{ data: GatewaytypesPostableIngestionKeyDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof createIngestionKey>>,
|
||||
TError,
|
||||
{ data: GatewaytypesPostableIngestionKeyDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getCreateIngestionKeyMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint deletes an ingestion key for the workspace
|
||||
* @summary Delete ingestion key for workspace
|
||||
*/
|
||||
export const deleteIngestionKey = ({
|
||||
keyId,
|
||||
}: DeleteIngestionKeyPathParameters) => {
|
||||
return GeneratedAPIInstance<void>({
|
||||
url: `/api/v2/gateway/ingestion_keys/${keyId}`,
|
||||
method: 'DELETE',
|
||||
});
|
||||
};
|
||||
|
||||
export const getDeleteIngestionKeyMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteIngestionKey>>,
|
||||
TError,
|
||||
{ pathParams: DeleteIngestionKeyPathParameters },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteIngestionKey>>,
|
||||
TError,
|
||||
{ pathParams: DeleteIngestionKeyPathParameters },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['deleteIngestionKey'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof deleteIngestionKey>>,
|
||||
{ pathParams: DeleteIngestionKeyPathParameters }
|
||||
> = (props) => {
|
||||
const { pathParams } = props ?? {};
|
||||
|
||||
return deleteIngestionKey(pathParams);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type DeleteIngestionKeyMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof deleteIngestionKey>>
|
||||
>;
|
||||
|
||||
export type DeleteIngestionKeyMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Delete ingestion key for workspace
|
||||
*/
|
||||
export const useDeleteIngestionKey = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteIngestionKey>>,
|
||||
TError,
|
||||
{ pathParams: DeleteIngestionKeyPathParameters },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof deleteIngestionKey>>,
|
||||
TError,
|
||||
{ pathParams: DeleteIngestionKeyPathParameters },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getDeleteIngestionKeyMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint updates an ingestion key for the workspace
|
||||
* @summary Update ingestion key for workspace
|
||||
*/
|
||||
export const updateIngestionKey = (
|
||||
{ keyId }: UpdateIngestionKeyPathParameters,
|
||||
gatewaytypesPostableIngestionKeyDTO: GatewaytypesPostableIngestionKeyDTO,
|
||||
) => {
|
||||
return GeneratedAPIInstance<void>({
|
||||
url: `/api/v2/gateway/ingestion_keys/${keyId}`,
|
||||
method: 'PATCH',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: gatewaytypesPostableIngestionKeyDTO,
|
||||
});
|
||||
};
|
||||
|
||||
export const getUpdateIngestionKeyMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateIngestionKey>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateIngestionKey>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['updateIngestionKey'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof updateIngestionKey>>,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyDTO;
|
||||
}
|
||||
> = (props) => {
|
||||
const { pathParams, data } = props ?? {};
|
||||
|
||||
return updateIngestionKey(pathParams, data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type UpdateIngestionKeyMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof updateIngestionKey>>
|
||||
>;
|
||||
export type UpdateIngestionKeyMutationBody = GatewaytypesPostableIngestionKeyDTO;
|
||||
export type UpdateIngestionKeyMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Update ingestion key for workspace
|
||||
*/
|
||||
export const useUpdateIngestionKey = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateIngestionKey>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof updateIngestionKey>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getUpdateIngestionKeyMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint creates an ingestion key limit
|
||||
* @summary Create limit for the ingestion key
|
||||
*/
|
||||
export const createIngestionKeyLimit = (
|
||||
{ keyId }: CreateIngestionKeyLimitPathParameters,
|
||||
gatewaytypesPostableIngestionKeyLimitDTO: GatewaytypesPostableIngestionKeyLimitDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<CreateIngestionKeyLimit201>({
|
||||
url: `/api/v2/gateway/ingestion_keys/${keyId}/limits`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: gatewaytypesPostableIngestionKeyLimitDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getCreateIngestionKeyLimitMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createIngestionKeyLimit>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: CreateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyLimitDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createIngestionKeyLimit>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: CreateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyLimitDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['createIngestionKeyLimit'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof createIngestionKeyLimit>>,
|
||||
{
|
||||
pathParams: CreateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyLimitDTO;
|
||||
}
|
||||
> = (props) => {
|
||||
const { pathParams, data } = props ?? {};
|
||||
|
||||
return createIngestionKeyLimit(pathParams, data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type CreateIngestionKeyLimitMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof createIngestionKeyLimit>>
|
||||
>;
|
||||
export type CreateIngestionKeyLimitMutationBody = GatewaytypesPostableIngestionKeyLimitDTO;
|
||||
export type CreateIngestionKeyLimitMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Create limit for the ingestion key
|
||||
*/
|
||||
export const useCreateIngestionKeyLimit = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createIngestionKeyLimit>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: CreateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyLimitDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof createIngestionKeyLimit>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: CreateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesPostableIngestionKeyLimitDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getCreateIngestionKeyLimitMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint deletes an ingestion key limit
|
||||
* @summary Delete limit for the ingestion key
|
||||
*/
|
||||
export const deleteIngestionKeyLimit = ({
|
||||
limitId,
|
||||
}: DeleteIngestionKeyLimitPathParameters) => {
|
||||
return GeneratedAPIInstance<void>({
|
||||
url: `/api/v2/gateway/ingestion_keys/limits/${limitId}`,
|
||||
method: 'DELETE',
|
||||
});
|
||||
};
|
||||
|
||||
export const getDeleteIngestionKeyLimitMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteIngestionKeyLimit>>,
|
||||
TError,
|
||||
{ pathParams: DeleteIngestionKeyLimitPathParameters },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteIngestionKeyLimit>>,
|
||||
TError,
|
||||
{ pathParams: DeleteIngestionKeyLimitPathParameters },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['deleteIngestionKeyLimit'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof deleteIngestionKeyLimit>>,
|
||||
{ pathParams: DeleteIngestionKeyLimitPathParameters }
|
||||
> = (props) => {
|
||||
const { pathParams } = props ?? {};
|
||||
|
||||
return deleteIngestionKeyLimit(pathParams);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type DeleteIngestionKeyLimitMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof deleteIngestionKeyLimit>>
|
||||
>;
|
||||
|
||||
export type DeleteIngestionKeyLimitMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Delete limit for the ingestion key
|
||||
*/
|
||||
export const useDeleteIngestionKeyLimit = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteIngestionKeyLimit>>,
|
||||
TError,
|
||||
{ pathParams: DeleteIngestionKeyLimitPathParameters },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof deleteIngestionKeyLimit>>,
|
||||
TError,
|
||||
{ pathParams: DeleteIngestionKeyLimitPathParameters },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getDeleteIngestionKeyLimitMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint updates an ingestion key limit
|
||||
* @summary Update limit for the ingestion key
|
||||
*/
|
||||
export const updateIngestionKeyLimit = (
|
||||
{ limitId }: UpdateIngestionKeyLimitPathParameters,
|
||||
gatewaytypesUpdatableIngestionKeyLimitDTO: GatewaytypesUpdatableIngestionKeyLimitDTO,
|
||||
) => {
|
||||
return GeneratedAPIInstance<void>({
|
||||
url: `/api/v2/gateway/ingestion_keys/limits/${limitId}`,
|
||||
method: 'PATCH',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: gatewaytypesUpdatableIngestionKeyLimitDTO,
|
||||
});
|
||||
};
|
||||
|
||||
export const getUpdateIngestionKeyLimitMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateIngestionKeyLimit>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesUpdatableIngestionKeyLimitDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateIngestionKeyLimit>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesUpdatableIngestionKeyLimitDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['updateIngestionKeyLimit'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof updateIngestionKeyLimit>>,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesUpdatableIngestionKeyLimitDTO;
|
||||
}
|
||||
> = (props) => {
|
||||
const { pathParams, data } = props ?? {};
|
||||
|
||||
return updateIngestionKeyLimit(pathParams, data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type UpdateIngestionKeyLimitMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof updateIngestionKeyLimit>>
|
||||
>;
|
||||
export type UpdateIngestionKeyLimitMutationBody = GatewaytypesUpdatableIngestionKeyLimitDTO;
|
||||
export type UpdateIngestionKeyLimitMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Update limit for the ingestion key
|
||||
*/
|
||||
export const useUpdateIngestionKeyLimit = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateIngestionKeyLimit>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesUpdatableIngestionKeyLimitDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof updateIngestionKeyLimit>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateIngestionKeyLimitPathParameters;
|
||||
data: GatewaytypesUpdatableIngestionKeyLimitDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getUpdateIngestionKeyLimitMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint returns the ingestion keys for a workspace
|
||||
* @summary Search ingestion keys for workspace
|
||||
*/
|
||||
export const searchIngestionKeys = (signal?: AbortSignal) => {
|
||||
return GeneratedAPIInstance<SearchIngestionKeys200>({
|
||||
url: `/api/v2/gateway/ingestion_keys/search`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getSearchIngestionKeysQueryKey = () => {
|
||||
return ['searchIngestionKeys'] as const;
|
||||
};
|
||||
|
||||
export const getSearchIngestionKeysQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof searchIngestionKeys>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof searchIngestionKeys>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey = queryOptions?.queryKey ?? getSearchIngestionKeysQueryKey();
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof searchIngestionKeys>>
|
||||
> = ({ signal }) => searchIngestionKeys(signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof searchIngestionKeys>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type SearchIngestionKeysQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof searchIngestionKeys>>
|
||||
>;
|
||||
export type SearchIngestionKeysQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Search ingestion keys for workspace
|
||||
*/
|
||||
|
||||
export function useSearchIngestionKeys<
|
||||
TData = Awaited<ReturnType<typeof searchIngestionKeys>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof searchIngestionKeys>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getSearchIngestionKeysQueryOptions(options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Search ingestion keys for workspace
|
||||
*/
|
||||
export const invalidateSearchIngestionKeys = async (
|
||||
queryClient: QueryClient,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getSearchIngestionKeysQueryKey() },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
112
frontend/src/api/generated/services/global/index.ts
Normal file
112
frontend/src/api/generated/services/global/index.ts
Normal file
@@ -0,0 +1,112 @@
|
||||
/**
|
||||
* ! Do not edit manually
|
||||
* * The file has been auto-generated using Orval for SigNoz
|
||||
* * regenerate with 'yarn generate:api'
|
||||
* SigNoz
|
||||
*/
|
||||
import { useQuery } from 'react-query';
|
||||
import type {
|
||||
InvalidateOptions,
|
||||
QueryClient,
|
||||
QueryFunction,
|
||||
QueryKey,
|
||||
UseQueryOptions,
|
||||
UseQueryResult,
|
||||
} from 'react-query';
|
||||
|
||||
import type {
|
||||
GetGlobalConfig200,
|
||||
RenderErrorResponseDTO,
|
||||
} from '../sigNoz.schemas';
|
||||
|
||||
import { GeneratedAPIInstance } from '../../../index';
|
||||
|
||||
type AwaitedInput<T> = PromiseLike<T> | T;
|
||||
|
||||
type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;
|
||||
|
||||
/**
|
||||
* This endpoints returns global config
|
||||
* @summary Get global config
|
||||
*/
|
||||
export const getGlobalConfig = (signal?: AbortSignal) => {
|
||||
return GeneratedAPIInstance<GetGlobalConfig200>({
|
||||
url: `/api/v1/global/config`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetGlobalConfigQueryKey = () => {
|
||||
return ['getGlobalConfig'] as const;
|
||||
};
|
||||
|
||||
export const getGetGlobalConfigQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getGlobalConfig>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getGlobalConfig>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey = queryOptions?.queryKey ?? getGetGlobalConfigQueryKey();
|
||||
|
||||
const queryFn: QueryFunction<Awaited<ReturnType<typeof getGlobalConfig>>> = ({
|
||||
signal,
|
||||
}) => getGlobalConfig(signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getGlobalConfig>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetGlobalConfigQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getGlobalConfig>>
|
||||
>;
|
||||
export type GetGlobalConfigQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get global config
|
||||
*/
|
||||
|
||||
export function useGetGlobalConfig<
|
||||
TData = Awaited<ReturnType<typeof getGlobalConfig>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getGlobalConfig>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetGlobalConfigQueryOptions(options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get global config
|
||||
*/
|
||||
export const invalidateGetGlobalConfig = async (
|
||||
queryClient: QueryClient,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetGlobalConfigQueryKey() },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
203
frontend/src/api/generated/services/logs/index.ts
Normal file
203
frontend/src/api/generated/services/logs/index.ts
Normal file
@@ -0,0 +1,203 @@
|
||||
/**
|
||||
* ! Do not edit manually
|
||||
* * The file has been auto-generated using Orval for SigNoz
|
||||
* * regenerate with 'yarn generate:api'
|
||||
* SigNoz
|
||||
*/
|
||||
import { useMutation, useQuery } from 'react-query';
|
||||
import type {
|
||||
InvalidateOptions,
|
||||
MutationFunction,
|
||||
QueryClient,
|
||||
QueryFunction,
|
||||
QueryKey,
|
||||
UseMutationOptions,
|
||||
UseMutationResult,
|
||||
UseQueryOptions,
|
||||
UseQueryResult,
|
||||
} from 'react-query';
|
||||
|
||||
import type {
|
||||
ListPromotedAndIndexedPaths200,
|
||||
PromotetypesPromotePathDTO,
|
||||
RenderErrorResponseDTO,
|
||||
} from '../sigNoz.schemas';
|
||||
|
||||
import { GeneratedAPIInstance } from '../../../index';
|
||||
|
||||
type AwaitedInput<T> = PromiseLike<T> | T;
|
||||
|
||||
type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;
|
||||
|
||||
/**
|
||||
* This endpoints promotes and indexes paths
|
||||
* @summary Promote and index paths
|
||||
*/
|
||||
export const listPromotedAndIndexedPaths = (signal?: AbortSignal) => {
|
||||
return GeneratedAPIInstance<ListPromotedAndIndexedPaths200>({
|
||||
url: `/api/v1/logs/promote_paths`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getListPromotedAndIndexedPathsQueryKey = () => {
|
||||
return ['listPromotedAndIndexedPaths'] as const;
|
||||
};
|
||||
|
||||
export const getListPromotedAndIndexedPathsQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof listPromotedAndIndexedPaths>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listPromotedAndIndexedPaths>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey =
|
||||
queryOptions?.queryKey ?? getListPromotedAndIndexedPathsQueryKey();
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof listPromotedAndIndexedPaths>>
|
||||
> = ({ signal }) => listPromotedAndIndexedPaths(signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listPromotedAndIndexedPaths>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type ListPromotedAndIndexedPathsQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof listPromotedAndIndexedPaths>>
|
||||
>;
|
||||
export type ListPromotedAndIndexedPathsQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Promote and index paths
|
||||
*/
|
||||
|
||||
export function useListPromotedAndIndexedPaths<
|
||||
  TData = Awaited<ReturnType<typeof listPromotedAndIndexedPaths>>,
  TError = RenderErrorResponseDTO
>(options?: {
  query?: UseQueryOptions<
    Awaited<ReturnType<typeof listPromotedAndIndexedPaths>>,
    TError,
    TData
  >;
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
  const queryOptions = getListPromotedAndIndexedPathsQueryOptions(options);

  const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
    queryKey: QueryKey;
  };

  query.queryKey = queryOptions.queryKey;

  return query;
}

/**
 * @summary Promote and index paths
 */
export const invalidateListPromotedAndIndexedPaths = async (
  queryClient: QueryClient,
  options?: InvalidateOptions,
): Promise<QueryClient> => {
  await queryClient.invalidateQueries(
    { queryKey: getListPromotedAndIndexedPathsQueryKey() },
    options,
  );

  return queryClient;
};

/**
 * This endpoint promotes and indexes paths
 * @summary Promote and index paths
 */
export const handlePromoteAndIndexPaths = (
  promotetypesPromotePathDTONull: PromotetypesPromotePathDTO[] | null,
  signal?: AbortSignal,
) => {
  return GeneratedAPIInstance<void>({
    url: `/api/v1/logs/promote_paths`,
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    data: promotetypesPromotePathDTONull,
    signal,
  });
};

export const getHandlePromoteAndIndexPathsMutationOptions = <
  TError = RenderErrorResponseDTO,
  TContext = unknown
>(options?: {
  mutation?: UseMutationOptions<
    Awaited<ReturnType<typeof handlePromoteAndIndexPaths>>,
    TError,
    { data: PromotetypesPromotePathDTO[] | null },
    TContext
  >;
}): UseMutationOptions<
  Awaited<ReturnType<typeof handlePromoteAndIndexPaths>>,
  TError,
  { data: PromotetypesPromotePathDTO[] | null },
  TContext
> => {
  const mutationKey = ['handlePromoteAndIndexPaths'];
  const { mutation: mutationOptions } = options
    ? options.mutation &&
      'mutationKey' in options.mutation &&
      options.mutation.mutationKey
      ? options
      : { ...options, mutation: { ...options.mutation, mutationKey } }
    : { mutation: { mutationKey } };

  const mutationFn: MutationFunction<
    Awaited<ReturnType<typeof handlePromoteAndIndexPaths>>,
    { data: PromotetypesPromotePathDTO[] | null }
  > = (props) => {
    const { data } = props ?? {};

    return handlePromoteAndIndexPaths(data);
  };

  return { mutationFn, ...mutationOptions };
};

export type HandlePromoteAndIndexPathsMutationResult = NonNullable<
  Awaited<ReturnType<typeof handlePromoteAndIndexPaths>>
>;
export type HandlePromoteAndIndexPathsMutationBody =
  | PromotetypesPromotePathDTO[]
  | null;
export type HandlePromoteAndIndexPathsMutationError = RenderErrorResponseDTO;

/**
 * @summary Promote and index paths
 */
export const useHandlePromoteAndIndexPaths = <
  TError = RenderErrorResponseDTO,
  TContext = unknown
>(options?: {
  mutation?: UseMutationOptions<
    Awaited<ReturnType<typeof handlePromoteAndIndexPaths>>,
    TError,
    { data: PromotetypesPromotePathDTO[] | null },
    TContext
  >;
}): UseMutationResult<
  Awaited<ReturnType<typeof handlePromoteAndIndexPaths>>,
  TError,
  { data: PromotetypesPromotePathDTO[] | null },
  TContext
> => {
  const mutationOptions = getHandlePromoteAndIndexPathsMutationOptions(options);

  return useMutation(mutationOptions);
};
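For orientation, here is a minimal sketch of how a component might consume the generated mutation hook above together with the list-invalidation helper. It is not part of the generated file; the component name, the `paths` prop, and the import paths for the generated service are illustrative assumptions.

```tsx
import { useQueryClient } from 'react-query';

// Assumed import locations for the generated service and schema types.
import {
  invalidateListPromotedAndIndexedPaths,
  useHandlePromoteAndIndexPaths,
} from 'api/generated/services/logs';
import type { PromotetypesPromotePathDTO } from 'api/generated/services/sigNoz.schemas';

function PromotePathsButton({
  paths,
}: {
  paths: PromotetypesPromotePathDTO[];
}): JSX.Element {
  const queryClient = useQueryClient();

  // The generated hook expects the request body under the `data` key.
  const { mutate, isLoading } = useHandlePromoteAndIndexPaths({
    mutation: {
      onSuccess: () => {
        // Refetch the promoted/indexed paths list after a successful promote.
        invalidateListPromotedAndIndexedPaths(queryClient);
      },
    },
  });

  return (
    <button
      type="button"
      disabled={isLoading}
      onClick={(): void => mutate({ data: paths })}
    >
      Promote paths
    </button>
  );
}
```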
790  frontend/src/api/generated/services/metrics/index.ts  Normal file
@@ -0,0 +1,790 @@
|
||||
/**
|
||||
* ! Do not edit manually
|
||||
* * The file has been auto-generated using Orval for SigNoz
|
||||
* * regenerate with 'yarn generate:api'
|
||||
* SigNoz
|
||||
*/
|
||||
import { useMutation, useQuery } from 'react-query';
|
||||
import type {
|
||||
InvalidateOptions,
|
||||
MutationFunction,
|
||||
QueryClient,
|
||||
QueryFunction,
|
||||
QueryKey,
|
||||
UseMutationOptions,
|
||||
UseMutationResult,
|
||||
UseQueryOptions,
|
||||
UseQueryResult,
|
||||
} from 'react-query';
|
||||
|
||||
import type {
|
||||
GetMetricAlerts200,
|
||||
GetMetricAlertsParams,
|
||||
GetMetricAttributes200,
|
||||
GetMetricDashboards200,
|
||||
GetMetricDashboardsParams,
|
||||
GetMetricHighlights200,
|
||||
GetMetricHighlightsParams,
|
||||
GetMetricMetadata200,
|
||||
GetMetricMetadataParams,
|
||||
GetMetricsStats200,
|
||||
GetMetricsTreemap200,
|
||||
MetricsexplorertypesMetricAttributesRequestDTO,
|
||||
MetricsexplorertypesStatsRequestDTO,
|
||||
MetricsexplorertypesTreemapRequestDTO,
|
||||
MetricsexplorertypesUpdateMetricMetadataRequestDTO,
|
||||
RenderErrorResponseDTO,
|
||||
UpdateMetricMetadataPathParameters,
|
||||
} from '../sigNoz.schemas';
|
||||
|
||||
import { GeneratedAPIInstance } from '../../../index';
|
||||
|
||||
type AwaitedInput<T> = PromiseLike<T> | T;
|
||||
|
||||
type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;
|
||||
|
||||
/**
|
||||
* This endpoint returns associated alerts for a specified metric
|
||||
* @summary Get metric alerts
|
||||
*/
|
||||
export const getMetricAlerts = (
|
||||
params?: GetMetricAlertsParams,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetMetricAlerts200>({
|
||||
url: `/api/v2/metric/alerts`,
|
||||
method: 'GET',
|
||||
params,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetMetricAlertsQueryKey = (params?: GetMetricAlertsParams) => {
|
||||
return ['getMetricAlerts', ...(params ? [params] : [])] as const;
|
||||
};
|
||||
|
||||
export const getGetMetricAlertsQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getMetricAlerts>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
params?: GetMetricAlertsParams,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricAlerts>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey = queryOptions?.queryKey ?? getGetMetricAlertsQueryKey(params);
|
||||
|
||||
const queryFn: QueryFunction<Awaited<ReturnType<typeof getMetricAlerts>>> = ({
|
||||
signal,
|
||||
}) => getMetricAlerts(params, signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricAlerts>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetMetricAlertsQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getMetricAlerts>>
|
||||
>;
|
||||
export type GetMetricAlertsQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get metric alerts
|
||||
*/
|
||||
|
||||
export function useGetMetricAlerts<
|
||||
TData = Awaited<ReturnType<typeof getMetricAlerts>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
params?: GetMetricAlertsParams,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricAlerts>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetMetricAlertsQueryOptions(params, options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get metric alerts
|
||||
*/
|
||||
export const invalidateGetMetricAlerts = async (
|
||||
queryClient: QueryClient,
|
||||
params?: GetMetricAlertsParams,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetMetricAlertsQueryKey(params) },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint returns associated dashboards for a specified metric
|
||||
* @summary Get metric dashboards
|
||||
*/
|
||||
export const getMetricDashboards = (
|
||||
params?: GetMetricDashboardsParams,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetMetricDashboards200>({
|
||||
url: `/api/v2/metric/dashboards`,
|
||||
method: 'GET',
|
||||
params,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetMetricDashboardsQueryKey = (
|
||||
params?: GetMetricDashboardsParams,
|
||||
) => {
|
||||
return ['getMetricDashboards', ...(params ? [params] : [])] as const;
|
||||
};
|
||||
|
||||
export const getGetMetricDashboardsQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getMetricDashboards>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
params?: GetMetricDashboardsParams,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricDashboards>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey =
|
||||
queryOptions?.queryKey ?? getGetMetricDashboardsQueryKey(params);
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof getMetricDashboards>>
|
||||
> = ({ signal }) => getMetricDashboards(params, signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricDashboards>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetMetricDashboardsQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getMetricDashboards>>
|
||||
>;
|
||||
export type GetMetricDashboardsQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get metric dashboards
|
||||
*/
|
||||
|
||||
export function useGetMetricDashboards<
|
||||
TData = Awaited<ReturnType<typeof getMetricDashboards>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
params?: GetMetricDashboardsParams,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricDashboards>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetMetricDashboardsQueryOptions(params, options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get metric dashboards
|
||||
*/
|
||||
export const invalidateGetMetricDashboards = async (
|
||||
queryClient: QueryClient,
|
||||
params?: GetMetricDashboardsParams,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetMetricDashboardsQueryKey(params) },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint returns highlights like number of datapoints, totaltimeseries, active time series, last received time for a specified metric
|
||||
* @summary Get metric highlights
|
||||
*/
|
||||
export const getMetricHighlights = (
|
||||
params?: GetMetricHighlightsParams,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetMetricHighlights200>({
|
||||
url: `/api/v2/metric/highlights`,
|
||||
method: 'GET',
|
||||
params,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetMetricHighlightsQueryKey = (
|
||||
params?: GetMetricHighlightsParams,
|
||||
) => {
|
||||
return ['getMetricHighlights', ...(params ? [params] : [])] as const;
|
||||
};
|
||||
|
||||
export const getGetMetricHighlightsQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getMetricHighlights>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
params?: GetMetricHighlightsParams,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricHighlights>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey =
|
||||
queryOptions?.queryKey ?? getGetMetricHighlightsQueryKey(params);
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof getMetricHighlights>>
|
||||
> = ({ signal }) => getMetricHighlights(params, signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricHighlights>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetMetricHighlightsQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getMetricHighlights>>
|
||||
>;
|
||||
export type GetMetricHighlightsQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get metric highlights
|
||||
*/
|
||||
|
||||
export function useGetMetricHighlights<
|
||||
TData = Awaited<ReturnType<typeof getMetricHighlights>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
params?: GetMetricHighlightsParams,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricHighlights>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetMetricHighlightsQueryOptions(params, options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get metric highlights
|
||||
*/
|
||||
export const invalidateGetMetricHighlights = async (
|
||||
queryClient: QueryClient,
|
||||
params?: GetMetricHighlightsParams,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetMetricHighlightsQueryKey(params) },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint helps to update metadata information like metric description, unit, type, temporality, monotonicity for a specified metric
|
||||
* @summary Update metric metadata
|
||||
*/
|
||||
export const updateMetricMetadata = (
|
||||
{ metricName }: UpdateMetricMetadataPathParameters,
|
||||
metricsexplorertypesUpdateMetricMetadataRequestDTO: MetricsexplorertypesUpdateMetricMetadataRequestDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<string>({
|
||||
url: `/api/v2/metrics/${metricName}/metadata`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: metricsexplorertypesUpdateMetricMetadataRequestDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getUpdateMetricMetadataMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateMetricMetadata>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateMetricMetadataPathParameters;
|
||||
data: MetricsexplorertypesUpdateMetricMetadataRequestDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateMetricMetadata>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateMetricMetadataPathParameters;
|
||||
data: MetricsexplorertypesUpdateMetricMetadataRequestDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['updateMetricMetadata'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof updateMetricMetadata>>,
|
||||
{
|
||||
pathParams: UpdateMetricMetadataPathParameters;
|
||||
data: MetricsexplorertypesUpdateMetricMetadataRequestDTO;
|
||||
}
|
||||
> = (props) => {
|
||||
const { pathParams, data } = props ?? {};
|
||||
|
||||
return updateMetricMetadata(pathParams, data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type UpdateMetricMetadataMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof updateMetricMetadata>>
|
||||
>;
|
||||
export type UpdateMetricMetadataMutationBody = MetricsexplorertypesUpdateMetricMetadataRequestDTO;
|
||||
export type UpdateMetricMetadataMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Update metric metadata
|
||||
*/
|
||||
export const useUpdateMetricMetadata = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateMetricMetadata>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateMetricMetadataPathParameters;
|
||||
data: MetricsexplorertypesUpdateMetricMetadataRequestDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof updateMetricMetadata>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateMetricMetadataPathParameters;
|
||||
data: MetricsexplorertypesUpdateMetricMetadataRequestDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getUpdateMetricMetadataMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint returns attribute keys and their unique values for a specified metric
|
||||
* @summary Get metric attributes
|
||||
*/
|
||||
export const getMetricAttributes = (
|
||||
metricsexplorertypesMetricAttributesRequestDTO: MetricsexplorertypesMetricAttributesRequestDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetMetricAttributes200>({
|
||||
url: `/api/v2/metrics/attributes`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: metricsexplorertypesMetricAttributesRequestDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetMetricAttributesMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof getMetricAttributes>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesMetricAttributesRequestDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof getMetricAttributes>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesMetricAttributesRequestDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['getMetricAttributes'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof getMetricAttributes>>,
|
||||
{ data: MetricsexplorertypesMetricAttributesRequestDTO }
|
||||
> = (props) => {
|
||||
const { data } = props ?? {};
|
||||
|
||||
return getMetricAttributes(data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type GetMetricAttributesMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getMetricAttributes>>
|
||||
>;
|
||||
export type GetMetricAttributesMutationBody = MetricsexplorertypesMetricAttributesRequestDTO;
|
||||
export type GetMetricAttributesMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get metric attributes
|
||||
*/
|
||||
export const useGetMetricAttributes = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof getMetricAttributes>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesMetricAttributesRequestDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof getMetricAttributes>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesMetricAttributesRequestDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getGetMetricAttributesMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint returns metadata information like metric description, unit, type, temporality, monotonicity for a specified metric
|
||||
* @summary Get metric metadata
|
||||
*/
|
||||
export const getMetricMetadata = (
|
||||
params?: GetMetricMetadataParams,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetMetricMetadata200>({
|
||||
url: `/api/v2/metrics/metadata`,
|
||||
method: 'GET',
|
||||
params,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetMetricMetadataQueryKey = (
|
||||
params?: GetMetricMetadataParams,
|
||||
) => {
|
||||
return ['getMetricMetadata', ...(params ? [params] : [])] as const;
|
||||
};
|
||||
|
||||
export const getGetMetricMetadataQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getMetricMetadata>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
params?: GetMetricMetadataParams,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricMetadata>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey =
|
||||
queryOptions?.queryKey ?? getGetMetricMetadataQueryKey(params);
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof getMetricMetadata>>
|
||||
> = ({ signal }) => getMetricMetadata(params, signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricMetadata>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetMetricMetadataQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getMetricMetadata>>
|
||||
>;
|
||||
export type GetMetricMetadataQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get metric metadata
|
||||
*/
|
||||
|
||||
export function useGetMetricMetadata<
|
||||
TData = Awaited<ReturnType<typeof getMetricMetadata>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
params?: GetMetricMetadataParams,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getMetricMetadata>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetMetricMetadataQueryOptions(params, options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get metric metadata
|
||||
*/
|
||||
export const invalidateGetMetricMetadata = async (
|
||||
queryClient: QueryClient,
|
||||
params?: GetMetricMetadataParams,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetMetricMetadataQueryKey(params) },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint provides list of metrics with their number of samples and timeseries for the given time range
|
||||
* @summary Get metrics statistics
|
||||
*/
|
||||
export const getMetricsStats = (
|
||||
metricsexplorertypesStatsRequestDTO: MetricsexplorertypesStatsRequestDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetMetricsStats200>({
|
||||
url: `/api/v2/metrics/stats`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: metricsexplorertypesStatsRequestDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetMetricsStatsMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof getMetricsStats>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesStatsRequestDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof getMetricsStats>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesStatsRequestDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['getMetricsStats'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof getMetricsStats>>,
|
||||
{ data: MetricsexplorertypesStatsRequestDTO }
|
||||
> = (props) => {
|
||||
const { data } = props ?? {};
|
||||
|
||||
return getMetricsStats(data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type GetMetricsStatsMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getMetricsStats>>
|
||||
>;
|
||||
export type GetMetricsStatsMutationBody = MetricsexplorertypesStatsRequestDTO;
|
||||
export type GetMetricsStatsMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get metrics statistics
|
||||
*/
|
||||
export const useGetMetricsStats = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof getMetricsStats>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesStatsRequestDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof getMetricsStats>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesStatsRequestDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getGetMetricsStatsMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint returns a treemap visualization showing the proportional distribution of metrics by sample count or time series count
|
||||
* @summary Get metrics treemap
|
||||
*/
|
||||
export const getMetricsTreemap = (
|
||||
metricsexplorertypesTreemapRequestDTO: MetricsexplorertypesTreemapRequestDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetMetricsTreemap200>({
|
||||
url: `/api/v2/metrics/treemap`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: metricsexplorertypesTreemapRequestDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetMetricsTreemapMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof getMetricsTreemap>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesTreemapRequestDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof getMetricsTreemap>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesTreemapRequestDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['getMetricsTreemap'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof getMetricsTreemap>>,
|
||||
{ data: MetricsexplorertypesTreemapRequestDTO }
|
||||
> = (props) => {
|
||||
const { data } = props ?? {};
|
||||
|
||||
return getMetricsTreemap(data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type GetMetricsTreemapMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getMetricsTreemap>>
|
||||
>;
|
||||
export type GetMetricsTreemapMutationBody = MetricsexplorertypesTreemapRequestDTO;
|
||||
export type GetMetricsTreemapMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get metrics treemap
|
||||
*/
|
||||
export const useGetMetricsTreemap = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof getMetricsTreemap>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesTreemapRequestDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof getMetricsTreemap>>,
|
||||
TError,
|
||||
{ data: MetricsexplorertypesTreemapRequestDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getGetMetricsTreemapMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
198  frontend/src/api/generated/services/orgs/index.ts  Normal file
@@ -0,0 +1,198 @@
/**
 * ! Do not edit manually
 * * The file has been auto-generated using Orval for SigNoz
 * * regenerate with 'yarn generate:api'
 * SigNoz
 */
import { useMutation, useQuery } from 'react-query';
import type {
  InvalidateOptions,
  MutationFunction,
  QueryClient,
  QueryFunction,
  QueryKey,
  UseMutationOptions,
  UseMutationResult,
  UseQueryOptions,
  UseQueryResult,
} from 'react-query';

import type {
  GetMyOrganization200,
  RenderErrorResponseDTO,
  TypesOrganizationDTO,
} from '../sigNoz.schemas';

import { GeneratedAPIInstance } from '../../../index';

type AwaitedInput<T> = PromiseLike<T> | T;

type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;

/**
 * This endpoint returns the organization I belong to
 * @summary Get my organization
 */
export const getMyOrganization = (signal?: AbortSignal) => {
  return GeneratedAPIInstance<GetMyOrganization200>({
    url: `/api/v2/orgs/me`,
    method: 'GET',
    signal,
  });
};

export const getGetMyOrganizationQueryKey = () => {
  return ['getMyOrganization'] as const;
};

export const getGetMyOrganizationQueryOptions = <
  TData = Awaited<ReturnType<typeof getMyOrganization>>,
  TError = RenderErrorResponseDTO
>(options?: {
  query?: UseQueryOptions<
    Awaited<ReturnType<typeof getMyOrganization>>,
    TError,
    TData
  >;
}) => {
  const { query: queryOptions } = options ?? {};

  const queryKey = queryOptions?.queryKey ?? getGetMyOrganizationQueryKey();

  const queryFn: QueryFunction<
    Awaited<ReturnType<typeof getMyOrganization>>
  > = ({ signal }) => getMyOrganization(signal);

  return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
    Awaited<ReturnType<typeof getMyOrganization>>,
    TError,
    TData
  > & { queryKey: QueryKey };
};

export type GetMyOrganizationQueryResult = NonNullable<
  Awaited<ReturnType<typeof getMyOrganization>>
>;
export type GetMyOrganizationQueryError = RenderErrorResponseDTO;

/**
 * @summary Get my organization
 */

export function useGetMyOrganization<
  TData = Awaited<ReturnType<typeof getMyOrganization>>,
  TError = RenderErrorResponseDTO
>(options?: {
  query?: UseQueryOptions<
    Awaited<ReturnType<typeof getMyOrganization>>,
    TError,
    TData
  >;
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
  const queryOptions = getGetMyOrganizationQueryOptions(options);

  const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
    queryKey: QueryKey;
  };

  query.queryKey = queryOptions.queryKey;

  return query;
}

/**
 * @summary Get my organization
 */
export const invalidateGetMyOrganization = async (
  queryClient: QueryClient,
  options?: InvalidateOptions,
): Promise<QueryClient> => {
  await queryClient.invalidateQueries(
    { queryKey: getGetMyOrganizationQueryKey() },
    options,
  );

  return queryClient;
};

/**
 * This endpoint updates the organization I belong to
 * @summary Update my organization
 */
export const updateMyOrganization = (
  typesOrganizationDTO: TypesOrganizationDTO,
) => {
  return GeneratedAPIInstance<void>({
    url: `/api/v2/orgs/me`,
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    data: typesOrganizationDTO,
  });
};

export const getUpdateMyOrganizationMutationOptions = <
  TError = RenderErrorResponseDTO,
  TContext = unknown
>(options?: {
  mutation?: UseMutationOptions<
    Awaited<ReturnType<typeof updateMyOrganization>>,
    TError,
    { data: TypesOrganizationDTO },
    TContext
  >;
}): UseMutationOptions<
  Awaited<ReturnType<typeof updateMyOrganization>>,
  TError,
  { data: TypesOrganizationDTO },
  TContext
> => {
  const mutationKey = ['updateMyOrganization'];
  const { mutation: mutationOptions } = options
    ? options.mutation &&
      'mutationKey' in options.mutation &&
      options.mutation.mutationKey
      ? options
      : { ...options, mutation: { ...options.mutation, mutationKey } }
    : { mutation: { mutationKey } };

  const mutationFn: MutationFunction<
    Awaited<ReturnType<typeof updateMyOrganization>>,
    { data: TypesOrganizationDTO }
  > = (props) => {
    const { data } = props ?? {};

    return updateMyOrganization(data);
  };

  return { mutationFn, ...mutationOptions };
};

export type UpdateMyOrganizationMutationResult = NonNullable<
  Awaited<ReturnType<typeof updateMyOrganization>>
>;
export type UpdateMyOrganizationMutationBody = TypesOrganizationDTO;
export type UpdateMyOrganizationMutationError = RenderErrorResponseDTO;

/**
 * @summary Update my organization
 */
export const useUpdateMyOrganization = <
  TError = RenderErrorResponseDTO,
  TContext = unknown
>(options?: {
  mutation?: UseMutationOptions<
    Awaited<ReturnType<typeof updateMyOrganization>>,
    TError,
    { data: TypesOrganizationDTO },
    TContext
  >;
}): UseMutationResult<
  Awaited<ReturnType<typeof updateMyOrganization>>,
  TError,
  { data: TypesOrganizationDTO },
  TContext
> => {
  const mutationOptions = getUpdateMyOrganizationMutationOptions(options);

  return useMutation(mutationOptions);
};
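As a rough illustration (not part of the generated file), the query and mutation hooks above are typically combined so that a successful update invalidates the cached read. The wrapper hook name, the import paths, and the shape of the returned `data` envelope are assumptions here; only the generated hook signatures come from the file above.

```tsx
import { useQueryClient } from 'react-query';

// Assumed import locations for the generated service and schema types.
import {
  invalidateGetMyOrganization,
  useGetMyOrganization,
  useUpdateMyOrganization,
} from 'api/generated/services/orgs';
import type { TypesOrganizationDTO } from 'api/generated/services/sigNoz.schemas';

function useOrgSettings() {
  const queryClient = useQueryClient();

  // Read path: cached under the generated ['getMyOrganization'] query key.
  const { data, isLoading } = useGetMyOrganization();

  // Write path: invalidate the read query so the UI refetches the updated org.
  const { mutateAsync: updateOrg } = useUpdateMyOrganization({
    mutation: {
      onSuccess: () => invalidateGetMyOrganization(queryClient),
    },
  });

  const saveOrg = (org: TypesOrganizationDTO) => updateOrg({ data: org });

  return { org: data, isLoading, saveOrg };
}
```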
612  frontend/src/api/generated/services/preferences/index.ts  Normal file
@@ -0,0 +1,612 @@
|
||||
/**
|
||||
* ! Do not edit manually
|
||||
* * The file has been auto-generated using Orval for SigNoz
|
||||
* * regenerate with 'yarn generate:api'
|
||||
* SigNoz
|
||||
*/
|
||||
import { useMutation, useQuery } from 'react-query';
|
||||
import type {
|
||||
InvalidateOptions,
|
||||
MutationFunction,
|
||||
QueryClient,
|
||||
QueryFunction,
|
||||
QueryKey,
|
||||
UseMutationOptions,
|
||||
UseMutationResult,
|
||||
UseQueryOptions,
|
||||
UseQueryResult,
|
||||
} from 'react-query';
|
||||
|
||||
import type {
|
||||
GetOrgPreference200,
|
||||
GetOrgPreferencePathParameters,
|
||||
GetUserPreference200,
|
||||
GetUserPreferencePathParameters,
|
||||
ListOrgPreferences200,
|
||||
ListUserPreferences200,
|
||||
PreferencetypesUpdatablePreferenceDTO,
|
||||
RenderErrorResponseDTO,
|
||||
UpdateOrgPreferencePathParameters,
|
||||
UpdateUserPreferencePathParameters,
|
||||
} from '../sigNoz.schemas';
|
||||
|
||||
import { GeneratedAPIInstance } from '../../../index';
|
||||
|
||||
type AwaitedInput<T> = PromiseLike<T> | T;
|
||||
|
||||
type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;
|
||||
|
||||
/**
|
||||
* This endpoint lists all org preferences
|
||||
* @summary List org preferences
|
||||
*/
|
||||
export const listOrgPreferences = (signal?: AbortSignal) => {
|
||||
return GeneratedAPIInstance<ListOrgPreferences200>({
|
||||
url: `/api/v1/org/preferences`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getListOrgPreferencesQueryKey = () => {
|
||||
return ['listOrgPreferences'] as const;
|
||||
};
|
||||
|
||||
export const getListOrgPreferencesQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof listOrgPreferences>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listOrgPreferences>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey = queryOptions?.queryKey ?? getListOrgPreferencesQueryKey();
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof listOrgPreferences>>
|
||||
> = ({ signal }) => listOrgPreferences(signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listOrgPreferences>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type ListOrgPreferencesQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof listOrgPreferences>>
|
||||
>;
|
||||
export type ListOrgPreferencesQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary List org preferences
|
||||
*/
|
||||
|
||||
export function useListOrgPreferences<
|
||||
TData = Awaited<ReturnType<typeof listOrgPreferences>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listOrgPreferences>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getListOrgPreferencesQueryOptions(options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary List org preferences
|
||||
*/
|
||||
export const invalidateListOrgPreferences = async (
|
||||
queryClient: QueryClient,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getListOrgPreferencesQueryKey() },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint returns the org preference by name
|
||||
* @summary Get org preference
|
||||
*/
|
||||
export const getOrgPreference = (
|
||||
{ name }: GetOrgPreferencePathParameters,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetOrgPreference200>({
|
||||
url: `/api/v1/org/preferences/${name}`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetOrgPreferenceQueryKey = ({
|
||||
name,
|
||||
}: GetOrgPreferencePathParameters) => {
|
||||
return ['getOrgPreference'] as const;
|
||||
};
|
||||
|
||||
export const getGetOrgPreferenceQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getOrgPreference>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ name }: GetOrgPreferencePathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getOrgPreference>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey =
|
||||
queryOptions?.queryKey ?? getGetOrgPreferenceQueryKey({ name });
|
||||
|
||||
const queryFn: QueryFunction<Awaited<ReturnType<typeof getOrgPreference>>> = ({
|
||||
signal,
|
||||
}) => getOrgPreference({ name }, signal);
|
||||
|
||||
return {
|
||||
queryKey,
|
||||
queryFn,
|
||||
enabled: !!name,
|
||||
...queryOptions,
|
||||
} as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getOrgPreference>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetOrgPreferenceQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getOrgPreference>>
|
||||
>;
|
||||
export type GetOrgPreferenceQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get org preference
|
||||
*/
|
||||
|
||||
export function useGetOrgPreference<
|
||||
TData = Awaited<ReturnType<typeof getOrgPreference>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ name }: GetOrgPreferencePathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getOrgPreference>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetOrgPreferenceQueryOptions({ name }, options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get org preference
|
||||
*/
|
||||
export const invalidateGetOrgPreference = async (
|
||||
queryClient: QueryClient,
|
||||
{ name }: GetOrgPreferencePathParameters,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetOrgPreferenceQueryKey({ name }) },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint updates the org preference by name
|
||||
* @summary Update org preference
|
||||
*/
|
||||
export const updateOrgPreference = (
|
||||
{ name }: UpdateOrgPreferencePathParameters,
|
||||
preferencetypesUpdatablePreferenceDTO: PreferencetypesUpdatablePreferenceDTO,
|
||||
) => {
|
||||
return GeneratedAPIInstance<void>({
|
||||
url: `/api/v1/org/preferences/${name}`,
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: preferencetypesUpdatablePreferenceDTO,
|
||||
});
|
||||
};
|
||||
|
||||
export const getUpdateOrgPreferenceMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateOrgPreference>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateOrgPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateOrgPreference>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateOrgPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['updateOrgPreference'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof updateOrgPreference>>,
|
||||
{
|
||||
pathParams: UpdateOrgPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
}
|
||||
> = (props) => {
|
||||
const { pathParams, data } = props ?? {};
|
||||
|
||||
return updateOrgPreference(pathParams, data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type UpdateOrgPreferenceMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof updateOrgPreference>>
|
||||
>;
|
||||
export type UpdateOrgPreferenceMutationBody = PreferencetypesUpdatablePreferenceDTO;
|
||||
export type UpdateOrgPreferenceMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Update org preference
|
||||
*/
|
||||
export const useUpdateOrgPreference = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateOrgPreference>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateOrgPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof updateOrgPreference>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateOrgPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getUpdateOrgPreferenceMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint lists all user preferences
|
||||
* @summary List user preferences
|
||||
*/
|
||||
export const listUserPreferences = (signal?: AbortSignal) => {
|
||||
return GeneratedAPIInstance<ListUserPreferences200>({
|
||||
url: `/api/v1/user/preferences`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getListUserPreferencesQueryKey = () => {
|
||||
return ['listUserPreferences'] as const;
|
||||
};
|
||||
|
||||
export const getListUserPreferencesQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof listUserPreferences>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listUserPreferences>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey = queryOptions?.queryKey ?? getListUserPreferencesQueryKey();
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof listUserPreferences>>
|
||||
> = ({ signal }) => listUserPreferences(signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listUserPreferences>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type ListUserPreferencesQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof listUserPreferences>>
|
||||
>;
|
||||
export type ListUserPreferencesQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary List user preferences
|
||||
*/
|
||||
|
||||
export function useListUserPreferences<
|
||||
TData = Awaited<ReturnType<typeof listUserPreferences>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof listUserPreferences>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getListUserPreferencesQueryOptions(options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary List user preferences
|
||||
*/
|
||||
export const invalidateListUserPreferences = async (
|
||||
queryClient: QueryClient,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getListUserPreferencesQueryKey() },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint returns the user preference by name
|
||||
* @summary Get user preference
|
||||
*/
|
||||
export const getUserPreference = (
|
||||
{ name }: GetUserPreferencePathParameters,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<GetUserPreference200>({
|
||||
url: `/api/v1/user/preferences/${name}`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetUserPreferenceQueryKey = ({
|
||||
name,
|
||||
}: GetUserPreferencePathParameters) => {
|
||||
return ['getUserPreference'] as const;
|
||||
};
|
||||
|
||||
export const getGetUserPreferenceQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getUserPreference>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ name }: GetUserPreferencePathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getUserPreference>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey =
|
||||
queryOptions?.queryKey ?? getGetUserPreferenceQueryKey({ name });
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof getUserPreference>>
|
||||
> = ({ signal }) => getUserPreference({ name }, signal);
|
||||
|
||||
return {
|
||||
queryKey,
|
||||
queryFn,
|
||||
enabled: !!name,
|
||||
...queryOptions,
|
||||
} as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getUserPreference>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetUserPreferenceQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getUserPreference>>
|
||||
>;
|
||||
export type GetUserPreferenceQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get user preference
|
||||
*/
|
||||
|
||||
export function useGetUserPreference<
|
||||
TData = Awaited<ReturnType<typeof getUserPreference>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(
|
||||
{ name }: GetUserPreferencePathParameters,
|
||||
options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getUserPreference>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
},
|
||||
): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetUserPreferenceQueryOptions({ name }, options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get user preference
|
||||
*/
|
||||
export const invalidateGetUserPreference = async (
|
||||
queryClient: QueryClient,
|
||||
{ name }: GetUserPreferencePathParameters,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetUserPreferenceQueryKey({ name }) },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint updates the user preference by name
|
||||
* @summary Update user preference
|
||||
*/
|
||||
export const updateUserPreference = (
|
||||
{ name }: UpdateUserPreferencePathParameters,
|
||||
preferencetypesUpdatablePreferenceDTO: PreferencetypesUpdatablePreferenceDTO,
|
||||
) => {
|
||||
return GeneratedAPIInstance<void>({
|
||||
url: `/api/v1/user/preferences/${name}`,
|
||||
method: 'PUT',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: preferencetypesUpdatablePreferenceDTO,
|
||||
});
|
||||
};
|
||||
|
||||
export const getUpdateUserPreferenceMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateUserPreference>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateUserPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateUserPreference>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateUserPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['updateUserPreference'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof updateUserPreference>>,
|
||||
{
|
||||
pathParams: UpdateUserPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
}
|
||||
> = (props) => {
|
||||
const { pathParams, data } = props ?? {};
|
||||
|
||||
return updateUserPreference(pathParams, data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type UpdateUserPreferenceMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof updateUserPreference>>
|
||||
>;
|
||||
export type UpdateUserPreferenceMutationBody = PreferencetypesUpdatablePreferenceDTO;
|
||||
export type UpdateUserPreferenceMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Update user preference
|
||||
*/
|
||||
export const useUpdateUserPreference = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof updateUserPreference>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateUserPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof updateUserPreference>>,
|
||||
TError,
|
||||
{
|
||||
pathParams: UpdateUserPreferencePathParameters;
|
||||
data: PreferencetypesUpdatablePreferenceDTO;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getUpdateUserPreferenceMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
662  frontend/src/api/generated/services/sessions/index.ts  Normal file
@@ -0,0 +1,662 @@
/**
 * ! Do not edit manually
 * * The file has been auto-generated using Orval for SigNoz
 * * regenerate with 'yarn generate:api'
 * SigNoz
 */
import { useMutation, useQuery } from 'react-query';
import type {
  InvalidateOptions,
  MutationFunction,
  QueryClient,
  QueryFunction,
  QueryKey,
  UseMutationOptions,
  UseMutationResult,
  UseQueryOptions,
  UseQueryResult,
} from 'react-query';

import type {
  AuthtypesPostableEmailPasswordSessionDTO,
  AuthtypesPostableRotateTokenDTO,
  CreateSessionByEmailPassword200,
  CreateSessionByGoogleCallback303,
  CreateSessionByOIDCCallback303,
  CreateSessionBySAMLCallback303,
  CreateSessionBySAMLCallbackBody,
  CreateSessionBySAMLCallbackParams,
  GetSessionContext200,
  RenderErrorResponseDTO,
  RotateSession200,
} from '../sigNoz.schemas';

import { GeneratedAPIInstance } from '../../../index';

type AwaitedInput<T> = PromiseLike<T> | T;

type Awaited<O> = O extends AwaitedInput<infer T> ? T : never;

/**
 * This endpoint creates a session for a user using google callback
 * @summary Create session by google callback
 */
export const createSessionByGoogleCallback = (signal?: AbortSignal) => {
  return GeneratedAPIInstance<unknown>({
    url: `/api/v1/complete/google`,
    method: 'GET',
    signal,
  });
};

export const getCreateSessionByGoogleCallbackQueryKey = () => {
  return ['createSessionByGoogleCallback'] as const;
};

export const getCreateSessionByGoogleCallbackQueryOptions = <
  TData = Awaited<ReturnType<typeof createSessionByGoogleCallback>>,
  TError = CreateSessionByGoogleCallback303 | RenderErrorResponseDTO
>(options?: {
  query?: UseQueryOptions<
    Awaited<ReturnType<typeof createSessionByGoogleCallback>>,
    TError,
    TData
  >;
}) => {
  const { query: queryOptions } = options ?? {};

  const queryKey =
    queryOptions?.queryKey ?? getCreateSessionByGoogleCallbackQueryKey();

  const queryFn: QueryFunction<
    Awaited<ReturnType<typeof createSessionByGoogleCallback>>
  > = ({ signal }) => createSessionByGoogleCallback(signal);

  return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
    Awaited<ReturnType<typeof createSessionByGoogleCallback>>,
    TError,
    TData
  > & { queryKey: QueryKey };
};

export type CreateSessionByGoogleCallbackQueryResult = NonNullable<
  Awaited<ReturnType<typeof createSessionByGoogleCallback>>
>;
export type CreateSessionByGoogleCallbackQueryError =
  | CreateSessionByGoogleCallback303
  | RenderErrorResponseDTO;

/**
 * @summary Create session by google callback
 */

export function useCreateSessionByGoogleCallback<
  TData = Awaited<ReturnType<typeof createSessionByGoogleCallback>>,
  TError = CreateSessionByGoogleCallback303 | RenderErrorResponseDTO
>(options?: {
  query?: UseQueryOptions<
    Awaited<ReturnType<typeof createSessionByGoogleCallback>>,
    TError,
    TData
  >;
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
  const queryOptions = getCreateSessionByGoogleCallbackQueryOptions(options);

  const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
    queryKey: QueryKey;
  };

  query.queryKey = queryOptions.queryKey;

  return query;
}

/**
 * @summary Create session by google callback
 */
export const invalidateCreateSessionByGoogleCallback = async (
  queryClient: QueryClient,
  options?: InvalidateOptions,
): Promise<QueryClient> => {
  await queryClient.invalidateQueries(
    { queryKey: getCreateSessionByGoogleCallbackQueryKey() },
    options,
  );

  return queryClient;
};

/**
 * This endpoint creates a session for a user using oidc callback
 * @summary Create session by oidc callback
 */
export const createSessionByOIDCCallback = (signal?: AbortSignal) => {
  return GeneratedAPIInstance<unknown>({
    url: `/api/v1/complete/oidc`,
    method: 'GET',
    signal,
  });
};

export const getCreateSessionByOIDCCallbackQueryKey = () => {
  return ['createSessionByOIDCCallback'] as const;
};

export const getCreateSessionByOIDCCallbackQueryOptions = <
  TData = Awaited<ReturnType<typeof createSessionByOIDCCallback>>,
  TError = CreateSessionByOIDCCallback303 | RenderErrorResponseDTO
>(options?: {
  query?: UseQueryOptions<
    Awaited<ReturnType<typeof createSessionByOIDCCallback>>,
    TError,
    TData
  >;
}) => {
  const { query: queryOptions } = options ?? {};

  const queryKey =
    queryOptions?.queryKey ?? getCreateSessionByOIDCCallbackQueryKey();

  const queryFn: QueryFunction<
    Awaited<ReturnType<typeof createSessionByOIDCCallback>>
  > = ({ signal }) => createSessionByOIDCCallback(signal);

  return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
    Awaited<ReturnType<typeof createSessionByOIDCCallback>>,
    TError,
    TData
  > & { queryKey: QueryKey };
};

export type CreateSessionByOIDCCallbackQueryResult = NonNullable<
  Awaited<ReturnType<typeof createSessionByOIDCCallback>>
>;
export type CreateSessionByOIDCCallbackQueryError =
  | CreateSessionByOIDCCallback303
  | RenderErrorResponseDTO;

/**
 * @summary Create session by oidc callback
 */

export function useCreateSessionByOIDCCallback<
  TData = Awaited<ReturnType<typeof createSessionByOIDCCallback>>,
  TError = CreateSessionByOIDCCallback303 | RenderErrorResponseDTO
>(options?: {
  query?: UseQueryOptions<
    Awaited<ReturnType<typeof createSessionByOIDCCallback>>,
    TError,
    TData
  >;
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
  const queryOptions = getCreateSessionByOIDCCallbackQueryOptions(options);

  const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
    queryKey: QueryKey;
  };

  query.queryKey = queryOptions.queryKey;

  return query;
}

/**
 * @summary Create session by oidc callback
 */
export const invalidateCreateSessionByOIDCCallback = async (
  queryClient: QueryClient,
  options?: InvalidateOptions,
): Promise<QueryClient> => {
  await queryClient.invalidateQueries(
    { queryKey: getCreateSessionByOIDCCallbackQueryKey() },
    options,
  );

  return queryClient;
};

/**
 * This endpoint creates a session for a user using saml callback
 * @summary Create session by saml callback
 */
export const createSessionBySAMLCallback = (
  createSessionBySAMLCallbackBody: CreateSessionBySAMLCallbackBody,
  params?: CreateSessionBySAMLCallbackParams,
  signal?: AbortSignal,
) => {
  const formUrlEncoded = new URLSearchParams();
  if (createSessionBySAMLCallbackBody.RelayState !== undefined) {
    formUrlEncoded.append(
      `RelayState`,
      createSessionBySAMLCallbackBody.RelayState,
    );
  }
  if (createSessionBySAMLCallbackBody.SAMLResponse !== undefined) {
    formUrlEncoded.append(
      `SAMLResponse`,
      createSessionBySAMLCallbackBody.SAMLResponse,
    );
  }

  return GeneratedAPIInstance<unknown>({
    url: `/api/v1/complete/saml`,
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    data: formUrlEncoded,
    params,
    signal,
  });
};

export const getCreateSessionBySAMLCallbackMutationOptions = <
  TError = CreateSessionBySAMLCallback303 | RenderErrorResponseDTO,
  TContext = unknown
>(options?: {
  mutation?: UseMutationOptions<
    Awaited<ReturnType<typeof createSessionBySAMLCallback>>,
    TError,
    {
      data: CreateSessionBySAMLCallbackBody;
      params?: CreateSessionBySAMLCallbackParams;
    },
    TContext
  >;
}): UseMutationOptions<
  Awaited<ReturnType<typeof createSessionBySAMLCallback>>,
  TError,
  {
    data: CreateSessionBySAMLCallbackBody;
    params?: CreateSessionBySAMLCallbackParams;
  },
  TContext
> => {
  const mutationKey = ['createSessionBySAMLCallback'];
  const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof createSessionBySAMLCallback>>,
|
||||
{
|
||||
data: CreateSessionBySAMLCallbackBody;
|
||||
params?: CreateSessionBySAMLCallbackParams;
|
||||
}
|
||||
> = (props) => {
|
||||
const { data, params } = props ?? {};
|
||||
|
||||
return createSessionBySAMLCallback(data, params);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type CreateSessionBySAMLCallbackMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof createSessionBySAMLCallback>>
|
||||
>;
|
||||
export type CreateSessionBySAMLCallbackMutationBody = CreateSessionBySAMLCallbackBody;
|
||||
export type CreateSessionBySAMLCallbackMutationError =
|
||||
| CreateSessionBySAMLCallback303
|
||||
| RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Create session by saml callback
|
||||
*/
|
||||
export const useCreateSessionBySAMLCallback = <
|
||||
TError = CreateSessionBySAMLCallback303 | RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createSessionBySAMLCallback>>,
|
||||
TError,
|
||||
{
|
||||
data: CreateSessionBySAMLCallbackBody;
|
||||
params?: CreateSessionBySAMLCallbackParams;
|
||||
},
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof createSessionBySAMLCallback>>,
|
||||
TError,
|
||||
{
|
||||
data: CreateSessionBySAMLCallbackBody;
|
||||
params?: CreateSessionBySAMLCallbackParams;
|
||||
},
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getCreateSessionBySAMLCallbackMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint deletes the session
|
||||
* @summary Delete session
|
||||
*/
|
||||
export const deleteSession = () => {
|
||||
return GeneratedAPIInstance<void>({
|
||||
url: `/api/v2/sessions`,
|
||||
method: 'DELETE',
|
||||
});
|
||||
};
|
||||
|
||||
export const getDeleteSessionMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteSession>>,
|
||||
TError,
|
||||
void,
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteSession>>,
|
||||
TError,
|
||||
void,
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['deleteSession'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof deleteSession>>,
|
||||
void
|
||||
> = () => {
|
||||
return deleteSession();
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type DeleteSessionMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof deleteSession>>
|
||||
>;
|
||||
|
||||
export type DeleteSessionMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Delete session
|
||||
*/
|
||||
export const useDeleteSession = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof deleteSession>>,
|
||||
TError,
|
||||
void,
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof deleteSession>>,
|
||||
TError,
|
||||
void,
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getDeleteSessionMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint returns the context for the session
|
||||
* @summary Get session context
|
||||
*/
|
||||
export const getSessionContext = (signal?: AbortSignal) => {
|
||||
return GeneratedAPIInstance<GetSessionContext200>({
|
||||
url: `/api/v2/sessions/context`,
|
||||
method: 'GET',
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getGetSessionContextQueryKey = () => {
|
||||
return ['getSessionContext'] as const;
|
||||
};
|
||||
|
||||
export const getGetSessionContextQueryOptions = <
|
||||
TData = Awaited<ReturnType<typeof getSessionContext>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getSessionContext>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}) => {
|
||||
const { query: queryOptions } = options ?? {};
|
||||
|
||||
const queryKey = queryOptions?.queryKey ?? getGetSessionContextQueryKey();
|
||||
|
||||
const queryFn: QueryFunction<
|
||||
Awaited<ReturnType<typeof getSessionContext>>
|
||||
> = ({ signal }) => getSessionContext(signal);
|
||||
|
||||
return { queryKey, queryFn, ...queryOptions } as UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getSessionContext>>,
|
||||
TError,
|
||||
TData
|
||||
> & { queryKey: QueryKey };
|
||||
};
|
||||
|
||||
export type GetSessionContextQueryResult = NonNullable<
|
||||
Awaited<ReturnType<typeof getSessionContext>>
|
||||
>;
|
||||
export type GetSessionContextQueryError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Get session context
|
||||
*/
|
||||
|
||||
export function useGetSessionContext<
|
||||
TData = Awaited<ReturnType<typeof getSessionContext>>,
|
||||
TError = RenderErrorResponseDTO
|
||||
>(options?: {
|
||||
query?: UseQueryOptions<
|
||||
Awaited<ReturnType<typeof getSessionContext>>,
|
||||
TError,
|
||||
TData
|
||||
>;
|
||||
}): UseQueryResult<TData, TError> & { queryKey: QueryKey } {
|
||||
const queryOptions = getGetSessionContextQueryOptions(options);
|
||||
|
||||
const query = useQuery(queryOptions) as UseQueryResult<TData, TError> & {
|
||||
queryKey: QueryKey;
|
||||
};
|
||||
|
||||
query.queryKey = queryOptions.queryKey;
|
||||
|
||||
return query;
|
||||
}
|
||||
|
||||
/**
|
||||
* @summary Get session context
|
||||
*/
|
||||
export const invalidateGetSessionContext = async (
|
||||
queryClient: QueryClient,
|
||||
options?: InvalidateOptions,
|
||||
): Promise<QueryClient> => {
|
||||
await queryClient.invalidateQueries(
|
||||
{ queryKey: getGetSessionContextQueryKey() },
|
||||
options,
|
||||
);
|
||||
|
||||
return queryClient;
|
||||
};
|
||||
|
||||
/**
|
||||
* This endpoint creates a session for a user using email and password.
|
||||
* @summary Create session by email and password
|
||||
*/
|
||||
export const createSessionByEmailPassword = (
|
||||
authtypesPostableEmailPasswordSessionDTO: AuthtypesPostableEmailPasswordSessionDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<CreateSessionByEmailPassword200>({
|
||||
url: `/api/v2/sessions/email_password`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: authtypesPostableEmailPasswordSessionDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getCreateSessionByEmailPasswordMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createSessionByEmailPassword>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableEmailPasswordSessionDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createSessionByEmailPassword>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableEmailPasswordSessionDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['createSessionByEmailPassword'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof createSessionByEmailPassword>>,
|
||||
{ data: AuthtypesPostableEmailPasswordSessionDTO }
|
||||
> = (props) => {
|
||||
const { data } = props ?? {};
|
||||
|
||||
return createSessionByEmailPassword(data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type CreateSessionByEmailPasswordMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof createSessionByEmailPassword>>
|
||||
>;
|
||||
export type CreateSessionByEmailPasswordMutationBody = AuthtypesPostableEmailPasswordSessionDTO;
|
||||
export type CreateSessionByEmailPasswordMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Create session by email and password
|
||||
*/
|
||||
export const useCreateSessionByEmailPassword = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof createSessionByEmailPassword>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableEmailPasswordSessionDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof createSessionByEmailPassword>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableEmailPasswordSessionDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getCreateSessionByEmailPasswordMutationOptions(
|
||||
options,
|
||||
);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
/**
|
||||
* This endpoint rotates the session
|
||||
* @summary Rotate session
|
||||
*/
|
||||
export const rotateSession = (
|
||||
authtypesPostableRotateTokenDTO: AuthtypesPostableRotateTokenDTO,
|
||||
signal?: AbortSignal,
|
||||
) => {
|
||||
return GeneratedAPIInstance<RotateSession200>({
|
||||
url: `/api/v2/sessions/rotate`,
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
data: authtypesPostableRotateTokenDTO,
|
||||
signal,
|
||||
});
|
||||
};
|
||||
|
||||
export const getRotateSessionMutationOptions = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof rotateSession>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableRotateTokenDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationOptions<
|
||||
Awaited<ReturnType<typeof rotateSession>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableRotateTokenDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationKey = ['rotateSession'];
|
||||
const { mutation: mutationOptions } = options
|
||||
? options.mutation &&
|
||||
'mutationKey' in options.mutation &&
|
||||
options.mutation.mutationKey
|
||||
? options
|
||||
: { ...options, mutation: { ...options.mutation, mutationKey } }
|
||||
: { mutation: { mutationKey } };
|
||||
|
||||
const mutationFn: MutationFunction<
|
||||
Awaited<ReturnType<typeof rotateSession>>,
|
||||
{ data: AuthtypesPostableRotateTokenDTO }
|
||||
> = (props) => {
|
||||
const { data } = props ?? {};
|
||||
|
||||
return rotateSession(data);
|
||||
};
|
||||
|
||||
return { mutationFn, ...mutationOptions };
|
||||
};
|
||||
|
||||
export type RotateSessionMutationResult = NonNullable<
|
||||
Awaited<ReturnType<typeof rotateSession>>
|
||||
>;
|
||||
export type RotateSessionMutationBody = AuthtypesPostableRotateTokenDTO;
|
||||
export type RotateSessionMutationError = RenderErrorResponseDTO;
|
||||
|
||||
/**
|
||||
* @summary Rotate session
|
||||
*/
|
||||
export const useRotateSession = <
|
||||
TError = RenderErrorResponseDTO,
|
||||
TContext = unknown
|
||||
>(options?: {
|
||||
mutation?: UseMutationOptions<
|
||||
Awaited<ReturnType<typeof rotateSession>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableRotateTokenDTO },
|
||||
TContext
|
||||
>;
|
||||
}): UseMutationResult<
|
||||
Awaited<ReturnType<typeof rotateSession>>,
|
||||
TError,
|
||||
{ data: AuthtypesPostableRotateTokenDTO },
|
||||
TContext
|
||||
> => {
|
||||
const mutationOptions = getRotateSessionMutationOptions(options);
|
||||
|
||||
return useMutation(mutationOptions);
|
||||
};
|
||||
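The generated service above pairs each sessions endpoint with a React Query hook and an invalidation helper. A minimal usage sketch follows; the import path, the DTO field names, and the surrounding component are assumptions for illustration only and are not part of this diff (the app is assumed to already be wrapped in a QueryClientProvider):

// Sketch only: path and DTO fields are assumed, not taken from this changeset.
import { useQueryClient } from 'react-query';
import {
  invalidateGetSessionContext,
  useCreateSessionByEmailPassword,
  useGetSessionContext,
} from 'api/generated/services/sessions';

function LoginExample(): JSX.Element {
  const queryClient = useQueryClient();
  const sessionContext = useGetSessionContext();
  const createSession = useCreateSessionByEmailPassword({
    mutation: {
      onSuccess: async (): Promise<void> => {
        // Refetch the session context after a successful login.
        await invalidateGetSessionContext(queryClient);
      },
    },
  });

  return (
    <button
      type="button"
      onClick={(): void =>
        // Field names inside `data` are hypothetical placeholders.
        createSession.mutate({ data: { email: 'user@example.com', password: 'secret' } })
      }
    >
      {sessionContext.isLoading ? 'Loading…' : 'Sign in'}
    </button>
  );
}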
frontend/src/api/generated/services/sigNoz.schemas.ts (new file, 1933 lines; diff suppressed because it is too large)
frontend/src/api/generated/services/users/index.ts (new file, 1570 lines; diff suppressed because it is too large)
@@ -229,6 +229,17 @@ export const GatewayApiV2Instance = axios.create({
baseURL: `${ENVIRONMENT.baseURL}${gatewayApiV2}`,
});

// generated API Instance
export const GeneratedAPIInstance = axios.create({
baseURL: ENVIRONMENT.baseURL,
});

GeneratedAPIInstance.interceptors.request.use(interceptorsRequestResponse);
GeneratedAPIInstance.interceptors.response.use(
interceptorsResponse,
interceptorRejected,
);

GatewayApiV2Instance.interceptors.response.use(
interceptorsResponse,
interceptorRejected,

@@ -75,7 +75,9 @@ export const getK8sClustersList = async (
...props.filters,
items: props.filters.items.reduce<typeof props.filters.items>(
(acc, item) => {
if (item.value === undefined) return acc;
if (item.value === undefined) {
return acc;
}
if (
item.key &&
typeof item.key === 'object' &&

@@ -83,7 +83,9 @@ export const getK8sDaemonSetsList = async (
...props.filters,
items: props.filters.items.reduce<typeof props.filters.items>(
(acc, item) => {
if (item.value === undefined) return acc;
if (item.value === undefined) {
return acc;
}
if (
item.key &&
typeof item.key === 'object' &&

@@ -82,7 +82,9 @@ export const getK8sDeploymentsList = async (
...props.filters,
items: props.filters.items.reduce<typeof props.filters.items>(
(acc, item) => {
if (item.value === undefined) return acc;
if (item.value === undefined) {
return acc;
}
if (
item.key &&
typeof item.key === 'object' &&

@@ -82,7 +82,9 @@ export const getK8sJobsList = async (
...props.filters,
items: props.filters.items.reduce<typeof props.filters.items>(
(acc, item) => {
if (item.value === undefined) return acc;
if (item.value === undefined) {
return acc;
}
if (
item.key &&
typeof item.key === 'object' &&

@@ -73,7 +73,9 @@ export const getK8sNamespacesList = async (
...props.filters,
items: props.filters.items.reduce<typeof props.filters.items>(
(acc, item) => {
if (item.value === undefined) return acc;
if (item.value === undefined) {
return acc;
}
if (
item.key &&
typeof item.key === 'object' &&

@@ -77,7 +77,9 @@ export const getK8sNodesList = async (
...props.filters,
items: props.filters.items.reduce<typeof props.filters.items>(
(acc, item) => {
if (item.value === undefined) return acc;
if (item.value === undefined) {
return acc;
}
if (
item.key &&
typeof item.key === 'object' &&

@@ -113,7 +113,9 @@ export const getK8sPodsList = async (
...props.filters,
items: props.filters.items.reduce<typeof props.filters.items>(
(acc, item) => {
if (item.value === undefined) return acc;
if (item.value === undefined) {
return acc;
}
if (
item.key &&
typeof item.key === 'object' &&

@@ -98,7 +98,9 @@ export const getK8sVolumesList = async (
...props.filters,
items: props.filters.items.reduce<typeof props.filters.items>(
(acc, item) => {
if (item.value === undefined) return acc;
if (item.value === undefined) {
return acc;
}
if (
item.key &&
typeof item.key === 'object' &&

@@ -81,7 +81,9 @@ export const getK8sStatefulSetsList = async (
...props.filters,
items: props.filters.items.reduce<typeof props.filters.items>(
(acc, item) => {
if (item.value === undefined) return acc;
if (item.value === undefined) {
return acc;
}
if (
item.key &&
typeof item.key === 'object' &&
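Each of the K8s list fetchers above receives the same change: the single-line guard inside the filters reduce is expanded into a braced block. The underlying pattern, shown as a standalone sketch (FilterItem is an assumed stand-in for the real filter item type, which is not part of this diff), is:

type FilterItem = { key?: unknown; value?: unknown };

// Drop filter items whose value is undefined before building the request payload.
function dropUndefinedValues(items: FilterItem[]): FilterItem[] {
  return items.reduce<FilterItem[]>((acc, item) => {
    if (item.value === undefined) {
      return acc;
    }
    acc.push(item);
    return acc;
  }, []);
}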
@@ -41,7 +41,7 @@ export const getConsumerLagDetails = async (
> => {
const { detailType, ...restProps } = props;
const response = await axios.post(
`/messaging-queues/kafka/consumer-lag/${props.detailType}`,
`/messaging-queues/kafka/consumer-lag/${detailType}`,
{
...restProps,
},

@@ -68,8 +68,12 @@ export function mapPanelTypeToRequestType(panelType: PANEL_TYPES): RequestType {
* Gets signal type from data source
*/
function getSignalType(dataSource: string): 'traces' | 'logs' | 'metrics' {
if (dataSource === 'traces') return 'traces';
if (dataSource === 'logs') return 'logs';
if (dataSource === 'traces') {
return 'traces';
}
if (dataSource === 'logs') {
return 'logs';
}
return 'metrics';
}

@@ -509,7 +513,9 @@ function reduceQueriesToObject(
// eslint-disable-line @typescript-eslint/no-explicit-any
const legends: Record<string, string> = {};
const queries = queryArray.reduce((acc, queryItem) => {
if (!queryItem.query) return acc;
if (!queryItem.query) {
return acc;
}
acc[queryItem.name] = queryItem;
legends[queryItem.name] = queryItem.legend;
return acc;

frontend/src/auto-import-registry.d.ts (vendored, 1 line changed)
@@ -15,6 +15,7 @@ import '@signozhq/button';
import '@signozhq/calendar';
import '@signozhq/callout';
import '@signozhq/checkbox';
import '@signozhq/combobox';
import '@signozhq/command';
import '@signozhq/design-tokens';
import '@signozhq/input';

@@ -20,7 +20,6 @@ import {
getQueueOverview,
QueueOverviewResponse,
} from 'api/messagingQueues/celery/getQueueOverview';
import { isNumber } from 'chart.js/helpers';
import { ResizeTable } from 'components/ResizeTable';
import { LOCALSTORAGE } from 'constants/localStorage';
import { QueryParams } from 'constants/query';

@@ -33,6 +32,7 @@ import { useMutation } from 'react-query';
import { useSelector } from 'react-redux';
import { AppState } from 'store/reducers';
import { GlobalReducer } from 'types/reducer/globalTime';
import { formatNumericValue } from 'utils/numericUtils';

const INITIAL_PAGE_SIZE = 20;

@@ -60,8 +60,12 @@ function ProgressRender(item: string | number): JSX.Element {
size="small"
strokeColor={((): string => {
const cpuPercent = percent;
if (cpuPercent >= 90) return Color.BG_SAKURA_500;
if (cpuPercent >= 60) return Color.BG_AMBER_500;
if (cpuPercent >= 90) {
return Color.BG_SAKURA_500;
}
if (cpuPercent >= 60) {
return Color.BG_AMBER_500;
}
return Color.BG_FOREST_500;
})()}
className="progress-bar"

@@ -239,10 +243,7 @@ function getColumns(data: RowData[]): TableColumnsType<RowData> {
const bValue = Number(b.p95_latency);
return aValue - bValue;
},
render: (value: number | string): string => {
if (!isNumber(value)) return value.toString();
return (typeof value === 'string' ? parseFloat(value) : value).toFixed(3);
},
render: formatNumericValue,
},
{
title: 'THROUGHPUT (ops/s)',

@@ -257,10 +258,7 @@ function getColumns(data: RowData[]): TableColumnsType<RowData> {
const bValue = Number(b.throughput);
return aValue - bValue;
},
render: (value: number | string): string => {
if (!isNumber(value)) return value.toString();
return (typeof value === 'string' ? parseFloat(value) : value).toFixed(3);
},
render: formatNumericValue,
},
];
}

@@ -337,7 +335,9 @@ function makeFilters(urlQuery: URLSearchParams): Filter[] {
return filterConfigs
.map(({ paramName, operator, key }) => {
const value = urlQuery.get(paramName);
if (!value) return null;
if (!value) {
return null;
}

return {
key: {

@@ -464,7 +464,9 @@ export default function CeleryOverviewTable({

const getFilteredData = useCallback(
(data: RowData[]): RowData[] => {
if (!searchText) return data;
if (!searchText) {
return data;
}

const searchLower = searchText.toLowerCase();
return data.filter((record) =>

@@ -69,7 +69,9 @@ export function useGetAllFilters(props: Filters): GetAllFiltersResponse {
const uniqueValues = [
...new Set(
responses.flatMap(({ payload }) => {
if (!payload) return [];
if (!payload) {
return [];
}

const dataType = filterAttributeKeyDataType || DataTypes.String;
const key = DATA_TYPE_VS_ATTRIBUTE_VALUES_KEY[dataType];

@@ -11,12 +11,24 @@ import { v4 as uuidv4 } from 'uuid';
export const getStepInterval = (startTime: number, endTime: number): number => {
const diffInMinutes = (endTime - startTime) / 1000000 / (60 * 1000); // Convert to minutes

if (diffInMinutes <= 15) return 60; // 15 min or less
if (diffInMinutes <= 30) return 60; // 30 min or less
if (diffInMinutes <= 60) return 120; // 1 hour or less
if (diffInMinutes <= 360) return 520; // 6 hours or less
if (diffInMinutes <= 1440) return 2440; // 1 day or less
if (diffInMinutes <= 10080) return 10080; // 1 week or less
if (diffInMinutes <= 15) {
return 60;
} // 15 min or less
if (diffInMinutes <= 30) {
return 60;
} // 30 min or less
if (diffInMinutes <= 60) {
return 120;
} // 1 hour or less
if (diffInMinutes <= 360) {
return 520;
} // 6 hours or less
if (diffInMinutes <= 1440) {
return 2440;
} // 1 day or less
if (diffInMinutes <= 10080) {
return 10080;
} // 1 week or less
return 54000; // More than a week (use monthly interval)
};
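A quick check of the thresholds in getStepInterval above (the double division implies startTime and endTime are epoch nanoseconds; the values below are illustrative only):

// 2 hours in nanoseconds is 120 minutes, which lands in the "6 hours or less" bucket.
const twoHoursNs = 2 * 60 * 60 * 1000 * 1_000_000;
const step = getStepInterval(0, twoHoursNs); // 520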
@@ -49,8 +49,12 @@ export const useGetValueFromWidget = (
|
||||
const isError = queries.some((query) => query.isError);
|
||||
|
||||
const values = queries.map((query) => {
|
||||
if (query.isLoading) return 'Loading...';
|
||||
if (query.isError) return 'Error';
|
||||
if (query.isLoading) {
|
||||
return 'Loading...';
|
||||
}
|
||||
if (query.isError) {
|
||||
return 'Error';
|
||||
}
|
||||
|
||||
const value = parseFloat(
|
||||
query.data?.payload?.data?.newResult?.data?.result?.[0]?.series?.[0]
|
||||
|
||||
@@ -49,7 +49,9 @@ export function useNavigateToExplorer(): (
|
||||
...(item.filters?.items || []),
|
||||
...selectedFilters,
|
||||
].filter((item) => {
|
||||
if (seen.has(item.id)) return false;
|
||||
if (seen.has(item.id)) {
|
||||
return false;
|
||||
}
|
||||
seen.add(item.id);
|
||||
return true;
|
||||
});
|
||||
|
||||
@@ -440,9 +440,12 @@ function ClientSideQBSearch(
|
||||
const values: Array<string | number | boolean> = [];
|
||||
const { tagValue } = getTagToken(searchValue);
|
||||
if (isArray(tagValue)) {
|
||||
if (!isEmpty(tagValue[tagValue.length - 1]))
|
||||
if (!isEmpty(tagValue[tagValue.length - 1])) {
|
||||
values.push(tagValue[tagValue.length - 1]);
|
||||
} else if (!isEmpty(tagValue)) values.push(tagValue);
|
||||
}
|
||||
} else if (!isEmpty(tagValue)) {
|
||||
values.push(tagValue);
|
||||
}
|
||||
|
||||
const currentAttributeValues =
|
||||
attributeValues?.stringAttributeValues ||
|
||||
@@ -556,7 +559,9 @@ function ClientSideQBSearch(
|
||||
disabled={isDisabled}
|
||||
$isEnabled={!!searchValue}
|
||||
onClick={(): void => {
|
||||
if (!isDisabled) tagEditHandler(value);
|
||||
if (!isDisabled) {
|
||||
tagEditHandler(value);
|
||||
}
|
||||
}}
|
||||
>
|
||||
{chipValue}
|
||||
|
||||
@@ -12,7 +12,7 @@ import {
|
||||
FixedDurationSuggestionOptions,
|
||||
Options,
|
||||
RelativeDurationSuggestionOptions,
|
||||
} from 'container/TopNav/DateTimeSelectionV2/config';
|
||||
} from 'container/TopNav/DateTimeSelectionV2/constants';
|
||||
import dayjs from 'dayjs';
|
||||
import { isValidShortHandDateTimeFormat } from 'lib/getMinMax';
|
||||
import { defaultTo, isFunction, noop } from 'lodash-es';
|
||||
|
||||
@@ -8,11 +8,11 @@ import { DATE_TIME_FORMATS } from 'constants/dateTimeFormats';
|
||||
import { QueryParams } from 'constants/query';
|
||||
import ROUTES from 'constants/routes';
|
||||
import { DateTimeRangeType } from 'container/TopNav/CustomDateTimeModal';
|
||||
import { RelativeDurationSuggestionOptions } from 'container/TopNav/DateTimeSelectionV2/constants';
|
||||
import {
|
||||
LexicalContext,
|
||||
Option,
|
||||
RelativeDurationSuggestionOptions,
|
||||
} from 'container/TopNav/DateTimeSelectionV2/config';
|
||||
} from 'container/TopNav/DateTimeSelectionV2/types';
|
||||
import dayjs from 'dayjs';
|
||||
import { Clock, PenLine, TriangleAlertIcon } from 'lucide-react';
|
||||
import { useTimezone } from 'providers/Timezone';
|
||||
|
||||
@@ -8,7 +8,7 @@ import {
|
||||
CustomTimeType,
|
||||
LexicalContext,
|
||||
Time,
|
||||
} from 'container/TopNav/DateTimeSelectionV2/config';
|
||||
} from 'container/TopNav/DateTimeSelectionV2/types';
|
||||
import dayjs, { Dayjs } from 'dayjs';
|
||||
import { useTimezone } from 'providers/Timezone';
|
||||
import { Dispatch, SetStateAction, useMemo } from 'react';
|
||||
|
||||
@@ -38,7 +38,9 @@ const normalizeTimezoneName = (timezone: string): string => {
|
||||
};
|
||||
|
||||
const formatOffset = (offsetMinutes: number): string => {
|
||||
if (offsetMinutes === 0) return 'UTC';
|
||||
if (offsetMinutes === 0) {
|
||||
return 'UTC';
|
||||
}
|
||||
|
||||
const hours = Math.floor(Math.abs(offsetMinutes) / 60);
|
||||
const minutes = Math.abs(offsetMinutes) % 60;
|
||||
|
||||
@@ -16,7 +16,9 @@ function DraggableTableRow({
|
||||
|
||||
const handleDrop = useCallback(
|
||||
(item: { index: number }) => {
|
||||
if (moveRow) moveRow(item.index, index);
|
||||
if (moveRow) {
|
||||
moveRow(item.index, index);
|
||||
}
|
||||
},
|
||||
[moveRow, index],
|
||||
);
|
||||
|
||||
@@ -13,9 +13,13 @@ function Editor({
|
||||
const isDarkMode = useIsDarkMode();
|
||||
|
||||
const onChangeHandler = (newValue?: string): void => {
|
||||
if (readOnly) return;
|
||||
if (readOnly) {
|
||||
return;
|
||||
}
|
||||
|
||||
if (typeof newValue === 'string' && onChange) onChange(newValue);
|
||||
if (typeof newValue === 'string' && onChange) {
|
||||
onChange(newValue);
|
||||
}
|
||||
};
|
||||
|
||||
const editorOptions = useMemo(
|
||||
|
||||
@@ -52,7 +52,9 @@ function MenuItemGenerator({
|
||||
const onMenuItemSelectHandler = useCallback(
|
||||
({ key }: { key: string }): void => {
|
||||
const currentViewDetails = getViewDetailsUsingViewKey(key, viewData);
|
||||
if (!currentViewDetails) return;
|
||||
if (!currentViewDetails) {
|
||||
return;
|
||||
}
|
||||
const { query, name, id, panelType: currentPanelType } = currentViewDetails;
|
||||
|
||||
handleExplorerTabChange(currentPanelType, {
|
||||
|
||||
@@ -43,16 +43,17 @@ export const omitIdFromQuery = (query: Query | null): any => ({
|
||||
builder: {
|
||||
...query?.builder,
|
||||
queryData: query?.builder.queryData.map((queryData) => {
|
||||
const { id, ...rest } = queryData.aggregateAttribute || {};
|
||||
const { id: _aggregateAttributeId, ...rest } =
|
||||
queryData.aggregateAttribute || {};
|
||||
const newAggregateAttribute = rest;
|
||||
const newGroupByAttributes = queryData.groupBy.map((groupByAttribute) => {
|
||||
const { id, ...rest } = groupByAttribute;
|
||||
const { id: _groupByAttributeId, ...rest } = groupByAttribute;
|
||||
return rest;
|
||||
});
|
||||
const newItems = queryData.filters?.items?.map((item) => {
|
||||
const { id, ...newItem } = item;
|
||||
const { id: _itemId, ...newItem } = item;
|
||||
if (item.key) {
|
||||
const { id, ...rest } = item.key;
|
||||
const { id: _keyId, ...rest } = item.key;
|
||||
return {
|
||||
...newItem,
|
||||
key: rest,
|
||||
|
||||
@@ -28,9 +28,15 @@ export const getYAxisFormattedValue = (
|
||||
const numValue = parseFloat(value);
|
||||
|
||||
// Handle non-numeric or special values first.
|
||||
if (isNaN(numValue)) return 'NaN';
|
||||
if (numValue === Infinity) return '∞';
|
||||
if (numValue === -Infinity) return '-∞';
|
||||
if (isNaN(numValue)) {
|
||||
return 'NaN';
|
||||
}
|
||||
if (numValue === Infinity) {
|
||||
return '∞';
|
||||
}
|
||||
if (numValue === -Infinity) {
|
||||
return '-∞';
|
||||
}
|
||||
|
||||
// For all other standard formats, delegate to grafana/data's built-in formatter.
|
||||
const computeDecimals = (): number | undefined => {
|
||||
@@ -41,8 +47,12 @@ export const getYAxisFormattedValue = (
|
||||
};
|
||||
|
||||
const fallbackFormat = (): string => {
|
||||
if (precision === PrecisionOptionsEnum.FULL) return numValue.toString();
|
||||
if (precision === 0) return Math.round(numValue).toString();
|
||||
if (precision === PrecisionOptionsEnum.FULL) {
|
||||
return numValue.toString();
|
||||
}
|
||||
if (precision === 0) {
|
||||
return Math.round(numValue).toString();
|
||||
}
|
||||
return precision !== undefined
|
||||
? numValue
|
||||
.toFixed(precision)
|
||||
|
||||
@@ -13,7 +13,7 @@ import DateTimeSelectionV2 from 'container/TopNav/DateTimeSelectionV2';
|
||||
import {
|
||||
CustomTimeType,
|
||||
Time,
|
||||
} from 'container/TopNav/DateTimeSelectionV2/config';
|
||||
} from 'container/TopNav/DateTimeSelectionV2/types';
|
||||
import TraceExplorerControls from 'container/TracesExplorer/Controls';
|
||||
import { PER_PAGE_OPTIONS } from 'container/TracesExplorer/ListView/configs';
|
||||
import { TracesLoading } from 'container/TracesExplorer/TraceLoading/TraceLoading';
|
||||
|
||||
@@ -24,7 +24,7 @@ import { INFRA_MONITORING_K8S_PARAMS_KEYS } from 'container/InfraMonitoringK8s/c
|
||||
import {
|
||||
CustomTimeType,
|
||||
Time,
|
||||
} from 'container/TopNav/DateTimeSelectionV2/config';
|
||||
} from 'container/TopNav/DateTimeSelectionV2/types';
|
||||
import { useIsDarkMode } from 'hooks/useDarkMode';
|
||||
import useUrlQuery from 'hooks/useUrlQuery';
|
||||
import GetMinMax from 'lib/getMinMax';
|
||||
@@ -450,8 +450,12 @@ function HostMetricsDetails({
|
||||
size="small"
|
||||
strokeColor={((): string => {
|
||||
const cpuPercent = Number((host.cpu * 100).toFixed(1));
|
||||
if (cpuPercent >= 90) return Color.BG_SAKURA_500;
|
||||
if (cpuPercent >= 60) return Color.BG_AMBER_500;
|
||||
if (cpuPercent >= 90) {
|
||||
return Color.BG_SAKURA_500;
|
||||
}
|
||||
if (cpuPercent >= 60) {
|
||||
return Color.BG_AMBER_500;
|
||||
}
|
||||
return Color.BG_FOREST_500;
|
||||
})()}
|
||||
className="progress-bar"
|
||||
@@ -463,8 +467,12 @@ function HostMetricsDetails({
|
||||
size="small"
|
||||
strokeColor={((): string => {
|
||||
const memoryPercent = Number((host.memory * 100).toFixed(1));
|
||||
if (memoryPercent >= 90) return Color.BG_CHERRY_500;
|
||||
if (memoryPercent >= 60) return Color.BG_AMBER_500;
|
||||
if (memoryPercent >= 90) {
|
||||
return Color.BG_CHERRY_500;
|
||||
}
|
||||
if (memoryPercent >= 60) {
|
||||
return Color.BG_AMBER_500;
|
||||
}
|
||||
return Color.BG_FOREST_500;
|
||||
})()}
|
||||
className="progress-bar"
|
||||
|
||||
@@ -5,7 +5,7 @@ import DateTimeSelectionV2 from 'container/TopNav/DateTimeSelectionV2';
|
||||
import {
|
||||
CustomTimeType,
|
||||
Time,
|
||||
} from 'container/TopNav/DateTimeSelectionV2/config';
|
||||
} from 'container/TopNav/DateTimeSelectionV2/types';
|
||||
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
|
||||
import { useMemo } from 'react';
|
||||
import { IBuilderQuery } from 'types/api/queryBuilder/queryBuilderData';
|
||||
|
||||
@@ -12,7 +12,7 @@ import DateTimeSelectionV2 from 'container/TopNav/DateTimeSelectionV2';
|
||||
import {
|
||||
CustomTimeType,
|
||||
Time,
|
||||
} from 'container/TopNav/DateTimeSelectionV2/config';
|
||||
} from 'container/TopNav/DateTimeSelectionV2/types';
|
||||
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
|
||||
import { useIsDarkMode } from 'hooks/useDarkMode';
|
||||
import { useResizeObserver } from 'hooks/useDimensions';
|
||||
|
||||
@@ -22,7 +22,9 @@ export default function WaitlistFragment({
|
||||
const [isSuccess, setIsSuccess] = useState(false);
|
||||
|
||||
const handleJoinWaitlist = (): void => {
|
||||
if (!user || !user.email) return;
|
||||
if (!user || !user.email) {
|
||||
return;
|
||||
}
|
||||
|
||||
setIsSubmitting(true);
|
||||
|
||||
|
||||
@@ -29,8 +29,9 @@ function QueryBuilderSearchWrapper({
|
||||
(!tagFiltersLength && (!filters || !filters.items.length)) ||
|
||||
tagFiltersLength === filters?.items.length ||
|
||||
!contextQuery
|
||||
)
|
||||
) {
|
||||
return;
|
||||
}
|
||||
|
||||
const nextQuery: Query = {
|
||||
...contextQuery,
|
||||
@@ -48,7 +49,9 @@ function QueryBuilderSearchWrapper({
|
||||
};
|
||||
|
||||
// eslint-disable-next-line react/jsx-no-useless-fragment
|
||||
if (!contextQuery || !isEdit) return <></>;
|
||||
if (!contextQuery || !isEdit) {
|
||||
return <></>;
|
||||
}
|
||||
|
||||
return (
|
||||
<QueryBuilderSearch
|
||||
|
||||
@@ -75,7 +75,9 @@ function LogDetailInner({
|
||||
const { stagedQuery, updateAllQueriesOperators } = useQueryBuilder();
|
||||
|
||||
const listQuery = useMemo(() => {
|
||||
if (!stagedQuery || stagedQuery.builder.queryData.length < 1) return null;
|
||||
if (!stagedQuery || stagedQuery.builder.queryData.length < 1) {
|
||||
return null;
|
||||
}
|
||||
|
||||
return stagedQuery.builder.queryData.find((item) => !item.disabled) || null;
|
||||
}, [stagedQuery]);
|
||||
@@ -153,7 +155,9 @@ function LogDetailInner({
|
||||
(value: string, queryIndex: number) => {
|
||||
// update the query at the given index
|
||||
setContextQuery((prev) => {
|
||||
if (!prev) return prev;
|
||||
if (!prev) {
|
||||
return prev;
|
||||
}
|
||||
|
||||
return {
|
||||
...prev,
|
||||
|
||||
@@ -127,7 +127,9 @@ function RawLogView({
|
||||
|
||||
const handleClickExpand = useCallback(
|
||||
(event: MouseEvent) => {
|
||||
if (isReadOnly) return;
|
||||
if (isReadOnly) {
|
||||
return;
|
||||
}
|
||||
|
||||
// Use custom click handler if provided, otherwise use default behavior
|
||||
if (onLogClick) {
|
||||
|
||||
@@ -52,7 +52,9 @@ function OptionsMenu({
|
||||
|
||||
const onChange = useCallback(
|
||||
(key: LogViewMode) => {
|
||||
if (!format) return;
|
||||
if (!format) {
|
||||
return;
|
||||
}
|
||||
|
||||
format.onChange(key);
|
||||
},
|
||||
@@ -148,7 +150,9 @@ function OptionsMenu({
|
||||
}
|
||||
|
||||
const handleKeyDown = (e: KeyboardEvent): void => {
|
||||
if (!selectedValue) return;
|
||||
if (!selectedValue) {
|
||||
return;
|
||||
}
|
||||
|
||||
const optionsData = addColumn?.options || [];
|
||||
|
||||
|
||||
@@ -45,7 +45,6 @@ function Pre({
|
||||
}
|
||||
|
||||
function Code({
|
||||
node,
|
||||
inline,
|
||||
className = 'blog-code',
|
||||
children,
|
||||
|
||||
@@ -13,7 +13,9 @@ function MessageTip({
|
||||
message,
|
||||
action,
|
||||
}: MessageTipProps): JSX.Element | null {
|
||||
if (!show) return null;
|
||||
if (!show) {
|
||||
return null;
|
||||
}
|
||||
|
||||
return (
|
||||
<StyledAlert showIcon description={message} type="info" action={action} />
|
||||
|
||||
@@ -165,7 +165,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
return;
|
||||
}
|
||||
|
||||
if (!onChange) return;
|
||||
if (!onChange) {
|
||||
return;
|
||||
}
|
||||
|
||||
// Case 1: Cleared (empty array or undefined)
|
||||
if (!newValue || currentNewValue.length === 0) {
|
||||
@@ -327,7 +329,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
* Selects all chips
|
||||
*/
|
||||
const selectAllChips = useCallback((): void => {
|
||||
if (selectedValues.length === 0) return;
|
||||
if (selectedValues.length === 0) {
|
||||
return;
|
||||
}
|
||||
|
||||
// When maxTagCount is set, only select visible chips
|
||||
const visibleCount =
|
||||
@@ -394,7 +398,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
* Handle copy event
|
||||
*/
|
||||
const handleCopy = useCallback((): void => {
|
||||
if (selectedChips.length === 0) return;
|
||||
if (selectedChips.length === 0) {
|
||||
return;
|
||||
}
|
||||
|
||||
const selectedTexts = selectedChips
|
||||
.sort((a, b) => a - b)
|
||||
@@ -409,7 +415,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
* Handle cut event
|
||||
*/
|
||||
const handleCut = useCallback((): void => {
|
||||
if (selectedChips.length === 0) return;
|
||||
if (selectedChips.length === 0) {
|
||||
return;
|
||||
}
|
||||
|
||||
// First copy the content
|
||||
handleCopy();
|
||||
@@ -577,7 +585,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
}
|
||||
}
|
||||
|
||||
if (onSearch) onSearch(trimmedValue);
|
||||
if (onSearch) {
|
||||
onSearch(trimmedValue);
|
||||
}
|
||||
},
|
||||
[
|
||||
onSearch,
|
||||
@@ -596,7 +606,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
*/
|
||||
const highlightMatchedText = useCallback(
|
||||
(text: string, searchQuery: string): React.ReactNode => {
|
||||
if (!searchQuery || !highlightSearch) return text;
|
||||
if (!searchQuery || !highlightSearch) {
|
||||
return text;
|
||||
}
|
||||
|
||||
try {
|
||||
const parts = text.split(
|
||||
@@ -632,7 +644,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
|
||||
// Adjusted handleSelectAll for internal change handler
|
||||
const handleSelectAll = useCallback((): void => {
|
||||
if (!options) return;
|
||||
if (!options) {
|
||||
return;
|
||||
}
|
||||
|
||||
if (isAllSelected) {
|
||||
// If all are selected, deselect all
|
||||
@@ -658,7 +672,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
const handleItemSelection = (source?: string): void => {
|
||||
// Special handling for ALL option is done by the caller
|
||||
|
||||
if (!option.value) return;
|
||||
if (!option.value) {
|
||||
return;
|
||||
}
|
||||
|
||||
if (source === 'option') {
|
||||
if (
|
||||
@@ -792,7 +808,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
// Helper function to get visible chip indices
|
||||
const getVisibleChipIndices = useCallback((): number[] => {
|
||||
// If no values, return empty array
|
||||
if (selectedValues.length === 0) return [];
|
||||
if (selectedValues.length === 0) {
|
||||
return [];
|
||||
}
|
||||
|
||||
// If maxTagCount is set and greater than 0, only return the first maxTagCount indices
|
||||
const visibleCount =
|
||||
@@ -836,7 +854,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
|
||||
// Get flattened list of all selectable options
|
||||
const getFlatOptions = (): OptionData[] => {
|
||||
if (!visibleOptions) return [];
|
||||
if (!visibleOptions) {
|
||||
return [];
|
||||
}
|
||||
|
||||
const flatList: OptionData[] = [];
|
||||
const hasAll = enableAllSelection && !searchText;
|
||||
@@ -1845,7 +1865,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
// but base indices/visibility on the original `selectedValues`
|
||||
if (!isAllSelected) {
|
||||
const index = selectedValues.indexOf(value);
|
||||
if (index === -1) return <div style={{ display: 'none' }} />; // Should not happen if value comes from displayValue
|
||||
if (index === -1) {
|
||||
return <div style={{ display: 'none' }} />;
|
||||
} // Should not happen if value comes from displayValue
|
||||
|
||||
const isActive = index === activeChipIndex;
|
||||
const isSelected = selectedChips.includes(index);
|
||||
@@ -1931,7 +1953,9 @@ const CustomMultiSelect: React.FC<CustomMultiSelectProps> = ({
|
||||
|
||||
// Normal clear behavior
|
||||
handleInternalChange([], true);
|
||||
if (onClear) onClear();
|
||||
if (onClear) {
|
||||
onClear();
|
||||
}
|
||||
}, [onClear, handleInternalChange, allOptionShown, isAllSelected]);
|
||||
|
||||
// ===== Component Rendering =====
|
||||
|
||||
@@ -147,7 +147,9 @@ const CustomSelect: React.FC<CustomSelectProps> = ({
|
||||
*/
|
||||
const highlightMatchedText = useCallback(
|
||||
(text: string, searchQuery: string): React.ReactNode => {
|
||||
if (!searchQuery || !highlightSearch) return text;
|
||||
if (!searchQuery || !highlightSearch) {
|
||||
return text;
|
||||
}
|
||||
|
||||
try {
|
||||
const parts = text.split(
|
||||
@@ -257,8 +259,12 @@ const CustomSelect: React.FC<CustomSelectProps> = ({
|
||||
<CloseOutlined
|
||||
onClick={(e): void => {
|
||||
e.stopPropagation();
|
||||
if (onChange) onChange(undefined, []);
|
||||
if (onClear) onClear();
|
||||
if (onChange) {
|
||||
onChange(undefined, []);
|
||||
}
|
||||
if (onClear) {
|
||||
onClear();
|
||||
}
|
||||
}}
|
||||
/>
|
||||
),
|
||||
@@ -280,7 +286,9 @@ const CustomSelect: React.FC<CustomSelectProps> = ({
|
||||
setActiveOptionIndex(0);
|
||||
}
|
||||
|
||||
if (onSearch) onSearch(trimmedValue);
|
||||
if (onSearch) {
|
||||
onSearch(trimmedValue);
|
||||
}
|
||||
},
|
||||
[onSearch, isOpen],
|
||||
);
|
||||
@@ -301,7 +309,9 @@ const CustomSelect: React.FC<CustomSelectProps> = ({
|
||||
if (isOpen) {
|
||||
// Get flattened list of all selectable options
|
||||
const getFlatOptions = (): OptionData[] => {
|
||||
if (!filteredOptions) return [];
|
||||
if (!filteredOptions) {
|
||||
return [];
|
||||
}
|
||||
|
||||
const flatList: OptionData[] = [];
|
||||
|
||||
|
||||
@@ -204,7 +204,9 @@ describe('CustomMultiSelect Component', () => {
|
||||
|
||||
options.forEach((option) => {
|
||||
const text = option.textContent || '';
|
||||
if (text.includes('Option 2')) foundOption2 = true;
|
||||
if (text.includes('Option 2')) {
|
||||
foundOption2 = true;
|
||||
}
|
||||
});
|
||||
|
||||
expect(foundOption2).toBe(true);
|
||||
|
||||
@@ -26,7 +26,9 @@ export const prioritizeOrAddOptionForSingleSelect = (
|
||||
(subOption) => subOption.value === value,
|
||||
);
|
||||
|
||||
if (extractedOption) foundOption = extractedOption;
|
||||
if (extractedOption) {
|
||||
foundOption = extractedOption;
|
||||
}
|
||||
|
||||
// Keep the group if it still has remaining options
|
||||
return remainingSubOptions.length > 0
|
||||
@@ -115,7 +117,9 @@ export const filterOptionsBySearch = (
|
||||
options: OptionData[],
|
||||
searchText: string,
|
||||
): OptionData[] => {
|
||||
if (!searchText.trim()) return options;
|
||||
if (!searchText.trim()) {
|
||||
return options;
|
||||
}
|
||||
|
||||
const lowerSearchText = searchText.toLowerCase();
|
||||
|
||||
|
||||
@@ -25,10 +25,14 @@ function mockOverflow(clientWidth: number, scrollWidth: number): void {
|
||||
function queryTooltipInner(): HTMLElement | null {
|
||||
// find element that has role="tooltip" (could be the inner itself)
|
||||
const tooltip = document.querySelector<HTMLElement>('[role="tooltip"]');
|
||||
if (!tooltip) return document.querySelector(TOOLTIP_INNER_SELECTOR);
|
||||
if (!tooltip) {
|
||||
return document.querySelector(TOOLTIP_INNER_SELECTOR);
|
||||
}
|
||||
|
||||
// if the role element is already the inner, return it; otherwise return its descendant
|
||||
if (tooltip.classList.contains('ant-tooltip-inner')) return tooltip;
|
||||
if (tooltip.classList.contains('ant-tooltip-inner')) {
|
||||
return tooltip;
|
||||
}
|
||||
return (
|
||||
(tooltip.querySelector(TOOLTIP_INNER_SELECTOR) as HTMLElement) ??
|
||||
document.querySelector(TOOLTIP_INNER_SELECTOR)
|
||||
@@ -52,7 +56,9 @@ describe('OverflowInputToolTip', () => {
|
||||
});
|
||||
|
||||
const tooltipInner = queryTooltipInner();
|
||||
if (!tooltipInner) throw new Error('Tooltip inner not found');
|
||||
if (!tooltipInner) {
|
||||
throw new Error('Tooltip inner not found');
|
||||
}
|
||||
expect(
|
||||
within(tooltipInner).getByText('Very long overflowing text'),
|
||||
).toBeInTheDocument();
|
||||
|
||||
@@ -157,7 +157,9 @@ function HavingFilter({
|
||||
|
||||
// Helper to check if we're after an operator
|
||||
const isAfterOperator = (tokens: string[]): boolean => {
|
||||
if (tokens.length === 0) return false;
|
||||
if (tokens.length === 0) {
|
||||
return false;
|
||||
}
|
||||
const lastToken = tokens[tokens.length - 1];
|
||||
// Check if the last token is exactly an operator or ends with an operator and space
|
||||
return havingOperators.some((op) => {
|
||||
|
||||
@@ -84,25 +84,33 @@ function getFunctionContextAtCursor(
|
||||
let funcName: string | null = null;
|
||||
let parenStack = 0;
|
||||
for (let i = cursorPos - 1; i >= 0; i--) {
|
||||
if (text[i] === ')') parenStack++;
|
||||
else if (text[i] === '(') {
|
||||
if (text[i] === ')') {
|
||||
parenStack++;
|
||||
} else if (text[i] === '(') {
|
||||
if (parenStack === 0) {
|
||||
openParenIndex = i;
|
||||
const before = text.slice(0, i);
|
||||
const match = before.match(/(\w+)\s*$/);
|
||||
if (match) funcName = match[1].toLowerCase();
|
||||
if (match) {
|
||||
funcName = match[1].toLowerCase();
|
||||
}
|
||||
break;
|
||||
}
|
||||
parenStack--;
|
||||
}
|
||||
}
|
||||
if (openParenIndex === -1 || !funcName) return null;
|
||||
if (openParenIndex === -1 || !funcName) {
|
||||
return null;
|
||||
}
|
||||
// Scan forwards to find the matching closing parenthesis
|
||||
let closeParenIndex = -1;
|
||||
let depth = 1;
|
||||
for (let j = openParenIndex + 1; j < text.length; j++) {
|
||||
if (text[j] === '(') depth++;
|
||||
else if (text[j] === ')') depth--;
|
||||
if (text[j] === '(') {
|
||||
depth++;
|
||||
} else if (text[j] === ')') {
|
||||
depth--;
|
||||
}
|
||||
if (depth === 0) {
|
||||
closeParenIndex = j;
|
||||
break;
|
||||
@@ -277,10 +285,14 @@ function QueryAggregationSelect({
|
||||
|
||||
// Transaction filter to limit aggregations
|
||||
const transactionFilterExtension = useMemo(() => {
|
||||
if (maxAggregations === undefined) return [];
|
||||
if (maxAggregations === undefined) {
|
||||
return [];
|
||||
}
|
||||
|
||||
return EditorState.transactionFilter.of((tr: Transaction) => {
|
||||
if (!tr.docChanged) return tr;
|
||||
if (!tr.docChanged) {
|
||||
return tr;
|
||||
}
|
||||
|
||||
const regex = /([a-zA-Z_][\w]*)\s*\(([^)]*)\)/g;
|
||||
const oldMatches = [
|
||||
@@ -425,7 +437,9 @@ function QueryAggregationSelect({
|
||||
() =>
|
||||
Object.keys(aggregateAttributeData?.data.data.keys || {}).flatMap((key) => {
|
||||
const attributeKeys = aggregateAttributeData?.data.data.keys[key];
|
||||
if (!attributeKeys) return [];
|
||||
if (!attributeKeys) {
|
||||
return [];
|
||||
}
|
||||
|
||||
return attributeKeys.map((attributeKey) => ({
|
||||
label: attributeKey.name,
|
||||
@@ -482,7 +496,9 @@ function QueryAggregationSelect({
|
||||
const start = match.index ?? 0;
|
||||
return cursorPos >= start && cursorPos <= start + match[0].length;
|
||||
});
|
||||
if (!isEditing) return null;
|
||||
if (!isEditing) {
|
||||
return null;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -523,7 +539,9 @@ function QueryAggregationSelect({
|
||||
const argsString = doc.slice(lastOpenParen + 1, cursorPos);
|
||||
argsString.split(',').forEach((arg) => {
|
||||
const trimmed = arg.trim();
|
||||
if (trimmed) usedArgs.add(trimmed);
|
||||
if (trimmed) {
|
||||
usedArgs.add(trimmed);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
|
||||
@@ -31,7 +31,9 @@ function TraceOperatorSection({
|
||||
]);
|
||||
|
||||
const traceOperatorWarning = useMemo(() => {
|
||||
if (currentQuery.builder.queryData.length === 0) return '';
|
||||
if (currentQuery.builder.queryData.length === 0) {
|
||||
return '';
|
||||
}
|
||||
const firstQuery = currentQuery.builder.queryData[0];
|
||||
return `Currently, you are only seeing results from query ${firstQuery.queryName}. Add a trace operator to combine results of multiple queries.`;
|
||||
}, [currentQuery]);
|
||||
|
||||
@@ -29,7 +29,6 @@ import {
|
||||
QUERY_BUILDER_OPERATORS_BY_KEY_TYPE,
|
||||
queryOperatorSuggestions,
|
||||
} from 'constants/antlrQueryConstants';
|
||||
import { useQueryBuilder } from 'hooks/queryBuilder/useQueryBuilder';
|
||||
import { useIsDarkMode } from 'hooks/useDarkMode';
|
||||
import useDebounce from 'hooks/useDebounce';
|
||||
import { debounce, isNull } from 'lodash-es';
|
||||
@@ -135,10 +134,14 @@ function QuerySearch({
|
||||
const updateEditorValue = useCallback(
|
||||
(value: string, options: { skipOnChange?: boolean } = {}): void => {
|
||||
const view = editorRef.current;
|
||||
if (!view) return;
|
||||
if (!view) {
|
||||
return;
|
||||
}
|
||||
|
||||
const currentValue = view.state.doc.toString();
|
||||
if (currentValue === value) return;
|
||||
if (currentValue === value) {
|
||||
return;
|
||||
}
|
||||
|
||||
if (options.skipOnChange) {
|
||||
isProgrammaticChangeRef.current = true;
|
||||
@@ -165,7 +168,9 @@ function QuerySearch({
|
||||
|
||||
useEffect(
|
||||
() => {
|
||||
if (!isEditorReady) return;
|
||||
if (!isEditorReady) {
|
||||
return;
|
||||
}
|
||||
|
||||
const newExpression = queryData.filter?.expression || '';
|
||||
const currentExpression = getCurrentExpression();
|
||||
@@ -202,8 +207,6 @@ function QuerySearch({
|
||||
const lastValueRef = useRef<string>('');
|
||||
const isMountedRef = useRef<boolean>(true);
|
||||
|
||||
const { handleRunQuery } = useQueryBuilder();
|
||||
|
||||
const { selectedDashboard } = useDashboard();
|
||||
|
||||
const dynamicVariables = useMemo(
|
||||
@@ -236,7 +239,9 @@ function QuerySearch({
|
||||
const toggleSuggestions = useCallback(
|
||||
(timeout?: number) => {
|
||||
const timeoutId = setTimeout(() => {
|
||||
if (!editorRef.current) return;
|
||||
if (!editorRef.current) {
|
||||
return;
|
||||
}
|
||||
if (isFocused) {
|
||||
startCompletion(editorRef.current);
|
||||
} else {
|
||||
@@ -281,7 +286,9 @@ function QuerySearch({
|
||||
options.forEach((opt) => merged.set(opt.label, opt));
|
||||
if (searchText && lastKeyRef.current !== searchText) {
|
||||
(keySuggestions || []).forEach((opt) => {
|
||||
if (!merged.has(opt.label)) merged.set(opt.label, opt);
|
||||
if (!merged.has(opt.label)) {
|
||||
merged.set(opt.label, opt);
|
||||
}
|
||||
});
|
||||
}
|
||||
setKeySuggestions(Array.from(merged.values()));
|
||||
@@ -346,7 +353,9 @@ function QuerySearch({
|
||||
|
||||
// Helper function to check if operator is for list operations (IN, NOT IN, etc.)
|
||||
const isListOperator = (op: string | undefined): boolean => {
|
||||
if (!op) return false;
|
||||
if (!op) {
|
||||
return false;
|
||||
}
|
||||
return op.toUpperCase() === 'IN' || op.toUpperCase() === 'NOT IN';
|
||||
};
|
||||
|
||||
@@ -400,8 +409,9 @@ function QuerySearch({
|
||||
!key ||
|
||||
(key === activeKey && !isLoadingSuggestions && !fetchingComplete) ||
|
||||
!isMountedRef.current
|
||||
)
|
||||
) {
|
||||
return;
|
||||
}
|
||||
|
||||
// Set loading state and store the key we're fetching for
|
||||
setIsLoadingSuggestions(true);
|
||||
@@ -538,7 +548,9 @@ function QuerySearch({
|
||||
);
|
||||
|
||||
const handleUpdate = useCallback((viewUpdate: { view: EditorView }): void => {
|
||||
if (!isMountedRef.current) return;
|
||||
if (!isMountedRef.current) {
|
||||
return;
|
||||
}
|
||||
|
||||
if (!editorRef.current) {
|
||||
editorRef.current = viewUpdate.view;
|
||||
@@ -582,13 +594,21 @@ function QuerySearch({
|
||||
| 'bracketList'
|
||||
| null = null;
|
||||
|
||||
if (context.isInKey) newContextType = 'key';
|
||||
else if (context.isInOperator) newContextType = 'operator';
|
||||
else if (context.isInValue) newContextType = 'value';
|
||||
else if (context.isInConjunction) newContextType = 'conjunction';
|
||||
else if (context.isInFunction) newContextType = 'function';
|
||||
else if (context.isInParenthesis) newContextType = 'parenthesis';
|
||||
else if (context.isInBracketList) newContextType = 'bracketList';
|
||||
if (context.isInKey) {
|
||||
newContextType = 'key';
|
||||
} else if (context.isInOperator) {
|
||||
newContextType = 'operator';
|
||||
} else if (context.isInValue) {
|
||||
newContextType = 'value';
|
||||
} else if (context.isInConjunction) {
|
||||
newContextType = 'conjunction';
|
||||
} else if (context.isInFunction) {
|
||||
newContextType = 'function';
|
||||
} else if (context.isInParenthesis) {
|
||||
newContextType = 'parenthesis';
|
||||
} else if (context.isInBracketList) {
|
||||
newContextType = 'bracketList';
|
||||
}
|
||||
|
||||
setQueryContext(context);
|
||||
|
||||
@@ -637,7 +657,9 @@ function QuerySearch({

   // Helper function to render a badge for the current context mode
   const renderContextBadge = (): JSX.Element => {
-    if (!editingMode) return <Tag>Unknown</Tag>;
+    if (!editingMode) {
+      return <Tag>Unknown</Tag>;
+    }

     switch (editingMode) {
       case 'key':
@@ -665,7 +687,9 @@ function QuerySearch({
       // This matches words before the cursor position
       // eslint-disable-next-line no-useless-escape
       const word = context.matchBefore(/[a-zA-Z0-9_.:/?&=#%\-\[\]]*/);
-      if (word?.from === word?.to && !context.explicit) return null;
+      if (word?.from === word?.to && !context.explicit) {
+        return null;
+      }

       // Get current query from editor
       const currentExpression = getCurrentExpression();
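The `matchBefore` guard above is the standard CodeMirror completion-source idiom: bail out when nothing word-like precedes the cursor and the user has not explicitly requested completion. A minimal, self-contained source using the same guard (the option labels are placeholders, not SigNoz's real suggestion data):

```ts
// Minimal completion source illustrating the matchBefore guard from the hunk.
import type { CompletionContext, CompletionResult } from '@codemirror/autocomplete';

export function keySuggestionSource(
  context: CompletionContext,
): CompletionResult | null {
  // Match identifier-like text (plus URL-ish characters) before the cursor.
  // eslint-disable-next-line no-useless-escape
  const word = context.matchBefore(/[a-zA-Z0-9_.:/?&=#%\-\[\]]*/);

  // Nothing typed yet and completion was not explicitly requested: stay quiet.
  if (word?.from === word?.to && !context.explicit) {
    return null;
  }

  return {
    from: word ? word.from : context.pos,
    options: [
      { label: 'service.name', type: 'keyword' },
      { label: 'http.status_code', type: 'keyword' },
    ],
  };
}
```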
@@ -1228,7 +1252,9 @@ function QuerySearch({
   }, [isFocused, toggleSuggestions]);

   useEffect(() => {
-    if (!queryContext) return;
+    if (!queryContext) {
+      return;
+    }
     // Trigger suggestions based on context
     if (editorRef.current) {
       toggleSuggestions(10);

@@ -87,7 +87,7 @@ function TraceOperatorEditor({
   // Track if the query was changed externally (from props) vs internally (user input)
   const [isExternalQueryChange, setIsExternalQueryChange] = useState(false);
   const [lastExternalValue, setLastExternalValue] = useState<string>('');
-  const { currentQuery, handleRunQuery } = useQueryBuilder();
+  const { currentQuery } = useQueryBuilder();

   const queryOptions = useMemo(
     () =>
@@ -104,7 +104,9 @@ function TraceOperatorEditor({
   const toggleSuggestions = useCallback(
     (timeout?: number) => {
       const timeoutId = setTimeout(() => {
-        if (!editorRef.current) return;
+        if (!editorRef.current) {
+          return;
+        }
         if (isFocused) {
           startCompletion(editorRef.current);
         } else {
@@ -152,7 +154,9 @@ function TraceOperatorEditor({
       // This matches words before the cursor position
       // eslint-disable-next-line no-useless-escape
       const word = context.matchBefore(/[a-zA-Z0-9_.:/?&=#%\-\[\]]*/);
-      if (word?.from === word?.to && !context.explicit) return null;
+      if (word?.from === word?.to && !context.explicit) {
+        return null;
+      }

       // Get the trace operator context at the cursor position
       const queryContext = getTraceOperatorContextAtCursor(value, cursorPos.ch);

@@ -375,7 +375,9 @@ export function getTraceOperatorContextAtCursor(
   let lastTokenBeforeCursor: IToken | null = null;
   for (let i = 0; i < allTokens.length; i++) {
     const token = allTokens[i];
-    if (token.type === TraceOperatorGrammarLexer.EOF) continue;
+    if (token.type === TraceOperatorGrammarLexer.EOF) {
+      continue;
+    }

     if (token.stop < cursorIndex || token.stop + 1 === cursorIndex) {
       lastTokenBeforeCursor = token;
@@ -390,7 +392,9 @@ export function getTraceOperatorContextAtCursor(
   let exactToken: IToken | null = null;
   for (let i = 0; i < allTokens.length; i++) {
     const token = allTokens[i];
-    if (token.type === TraceOperatorGrammarLexer.EOF) continue;
+    if (token.type === TraceOperatorGrammarLexer.EOF) {
+      continue;
+    }

     if (token.start <= cursorIndex && cursorIndex <= token.stop + 1) {
       exactToken = token;
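The last two hunks adjust the token scan inside `getTraceOperatorContextAtCursor`: skip the lexer's EOF token, remember the last token that ends at or just before the cursor, and capture the token whose span contains the cursor. A condensed sketch of that logic, with the `IToken` shape reduced to the fields the diff actually reads and a stand-in EOF constant instead of `TraceOperatorGrammarLexer.EOF`:

```ts
// Generic sketch of the cursor/token scan; IToken is trimmed to the fields used.
interface IToken {
  type: number;
  start: number; // index of the token's first character
  stop: number; // index of the token's last character
}

const EOF = -1; // stand-in for TraceOperatorGrammarLexer.EOF

export function findCursorTokens(
  allTokens: IToken[],
  cursorIndex: number,
): { lastTokenBeforeCursor: IToken | null; exactToken: IToken | null } {
  let lastTokenBeforeCursor: IToken | null = null;
  let exactToken: IToken | null = null;

  for (let i = 0; i < allTokens.length; i++) {
    const token = allTokens[i];
    if (token.type === EOF) {
      continue;
    }
    // Token ends before the cursor (or immediately at it): remember it.
    if (token.stop < cursorIndex || token.stop + 1 === cursorIndex) {
      lastTokenBeforeCursor = token;
    }
    // Cursor falls inside this token's span: it is the exact token.
    if (token.start <= cursorIndex && cursorIndex <= token.stop + 1) {
      exactToken = token;
    }
  }

  return { lastTokenBeforeCursor, exactToken };
}
```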