Conversation

@mergify (mergify bot, Contributor) commented Mar 17, 2025

Release notes

[rn:skip]

What does this PR do?

  • Upgrades the elasticsearch-ruby client and, in particular, moves to the new elastic-transport Ruby client.
  • Because 8.x still ships an elasticsearch_client.rb wrapper used by modules, it needs to be updated separately.

Also checks the plugins that still use the older elasticsearch-transport client.

On 8.x the elasticsearch-ruby client is still used by modules (in 9.x it was removed with #16794), so this PR also includes the changes needed in the wrapped elasticsearch_client.rb.
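
For orientation, here is a minimal sketch (not code from this PR) of the namespace move the upgrade implies: the 7.x client sits on the elasticsearch-transport gem, while the 8.x client sits on elastic-transport, so code that references the transport constants directly has to follow the rename. The hosts and credentials below are placeholders matching the test config further down.

```ruby
# 7.x: the low-level transport client lived in the elasticsearch-transport gem.
#   require 'elasticsearch/transport'
#   transport = Elasticsearch::Transport::Client.new(hosts: ['http://localhost:9200'])

# 8.x: the equivalent client lives in the elastic-transport gem.
require 'elastic-transport'
transport = Elastic::Transport::Client.new(hosts: ['http://localhost:9200'])

# The high-level entry point stays the same across both major versions.
require 'elasticsearch'
client = Elasticsearch::Client.new(cloud_id: 'my-cloud-id', api_key: 'my-cloud:api-key')
```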

Why is it important/What is the impact to the user?

No user impact

Checklist

  • My code follows the style guidelines of this project
  • I have commented my code, particularly in hard-to-understand areas
  • [ ] I have made corresponding changes to the documentation
  • [ ] I have made corresponding change to the default configuration files (and/or docker env variables)
  • I have added tests that prove my fix is effective or that my feature works

Author's Checklist

How to test this PR locally

Related issues

Use cases

Screenshots

Logs

  • elasticsearch.conf config
```
input {
    elasticsearch {
        cloud_id => "my-cloud-id"
        api_key => "my-cloud:api-key"
    }
}

filter {
    elasticsearch {
        cloud_id => "my-cloud-id"
        api_key => "my-cloud:api-key"
        query => "type:start AND operation:%{[opid]}"
        fields => { "@timestamp" => "started"}
    }
}

output {
    stdout {
        codec => rubydebug
    }
}
```
  • Logs
```
➜  logstash git:(es-ruby-client-upgrade) ✗ bin/logstash -f config/elasticsearch.conf --enable-local-plugin-development --log.level=trace
Using system java: /Users/mashhur/.sdkman/candidates/java/current/bin/java
Sending Logstash logs to /Users/mashhur/Dev/elastic/logstash/logs which is now configured via log4j2.properties
[2025-03-03T23:26:21,145][INFO ][logstash.runner          ] Log4j configuration path used is: /Users/mashhur/Dev/elastic/logstash/config/log4j2.properties
[2025-03-03T23:26:21,147][WARN ][logstash.runner          ] The use of JAVA_HOME has been deprecated. Logstash 8.0 and later ignores JAVA_HOME and uses the bundled JDK. Running Logstash with the bundled JDK is recommended. The bundled JDK has been verified to work with each specific version of Logstash, and generally provides best performance and reliability. If you have compelling reasons for using your own JDK (organizational-specific compliance requirements, for example), you can configure LS_JAVA_HOME to use that version instead.
[2025-03-03T23:26:21,148][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"9.1.0", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.5+11-LTS on 21.0.5+11-LTS +indy +jit [arm64-darwin]"}
[2025-03-03T23:26:21,148][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2025-03-03T23:26:21,164][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000` (logstash default)
[2025-03-03T23:26:21,164][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000` (logstash default)
[2025-03-03T23:26:21,164][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-nesting-depth` configured to `1000` (logstash default)
[2025-03-03T23:26:21,168][DEBUG][logstash.runner          ] Setting global FieldReference escape style: none
[2025-03-03T23:26:21,174][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2025-03-03T23:26:21,174][DEBUG][logstash.runner          ] allow_superuser: false
[2025-03-03T23:26:21,174][DEBUG][logstash.runner          ] node.name: Mashhurs-MacBook-Pro.local
[2025-03-03T23:26:21,174][DEBUG][logstash.runner          ] *path.config: config/input-elasticsearch.conf
[2025-03-03T23:26:21,174][DEBUG][logstash.runner          ] path.data: /Users/mashhur/Dev/elastic/logstash/data
[2025-03-03T23:26:21,174][DEBUG][logstash.runner          ] *config.string: null
[2025-03-03T23:26:21,174][DEBUG][logstash.runner          ] config.test_and_exit: false
[2025-03-03T23:26:21,174][DEBUG][logstash.runner          ] config.reload.automatic: false
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] config.reload.interval: TimeValue{duration=3, timeUnit=SECONDS}
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] config.support_escapes: false
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] config.field_reference.escape_style: none
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] metric.collect: true
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.id: main
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.system: false
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.workers: 10
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.plugin_classloaders: false
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.separate_logs: false
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.ordered: auto
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] pipeline.ecs_compatibility: v8
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] path.plugins: []
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] *interactive: null
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] config.debug: false
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] *log.level: trace (default: info)
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] version: false
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] help: false
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] *enable-local-plugin-development: true (default: false)
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] log.format: plain
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] log.format.json.fix_duplicate_message_fields: true
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] api.enabled: true
[2025-03-03T23:26:21,175][DEBUG][logstash.runner          ] api.http.host: 127.0.0.1
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.http.port: 9600..9700
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.environment: production
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.auth.type: none
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] *api.auth.basic.username: null
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] *api.auth.basic.password: null
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.auth.basic.password_policy.mode: WARN
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.auth.basic.password_policy.length.minimum: 8
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.upper: REQUIRED
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.lower: REQUIRED
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.digit: REQUIRED
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.auth.basic.password_policy.include.symbol: OPTIONAL
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.ssl.enabled: false
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] *api.ssl.keystore.path: null
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] *api.ssl.keystore.password: null
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] api.ssl.supported_protocols: []
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] queue.type: memory
[2025-03-03T23:26:21,176][DEBUG][logstash.runner          ] queue.drain: false
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] queue.max_events: 0
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] queue.checkpoint.retry: true
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] dead_letter_queue.flush_interval: 5000
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] dead_letter_queue.storage_policy: drop_newer
[2025-03-03T23:26:21,179][DEBUG][logstash.runner          ] *dead_letter_queue.retain.age: null
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] slowlog.threshold.warn: TimeValue{duration=-1, timeUnit=NANOSECONDS}
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] slowlog.threshold.info: TimeValue{duration=-1, timeUnit=NANOSECONDS}
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] slowlog.threshold.debug: TimeValue{duration=-1, timeUnit=NANOSECONDS}
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] slowlog.threshold.trace: TimeValue{duration=-1, timeUnit=NANOSECONDS}
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] keystore.classname: org.logstash.secret.store.backend.JavaKeyStore
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] keystore.file: /Users/mashhur/Dev/elastic/logstash/config/logstash.keystore
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] *monitoring.cluster_uuid: null
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] pipeline.buffer.type: heap
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] path.queue: /Users/mashhur/Dev/elastic/logstash/data/queue
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] path.dead_letter_queue: /Users/mashhur/Dev/elastic/logstash/data/dead_letter_queue
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] path.settings: /Users/mashhur/Dev/elastic/logstash/config
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] path.logs: /Users/mashhur/Dev/elastic/logstash/logs
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] xpack.geoip.downloader.endpoint: https://geoip.elastic.co/v1/database
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] xpack.geoip.download.endpoint: https://geoip.elastic.co/v1/database
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] xpack.geoip.downloader.poll.interval: TimeValue{duration=24, timeUnit=HOURS}
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] xpack.geoip.downloader.enabled: true
[2025-03-03T23:26:21,180][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: TimeValue{duration=5, timeUnit=SECONDS}
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: logstash_system
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.password: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.cloud_id: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.cloud_auth: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.api_key: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.proxy: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.ssl.certificate_authority: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.ssl.ca_trusted_fingerprint: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.ssl.truststore.path: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.ssl.truststore.password: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.ssl.keystore.path: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.ssl.keystore.password: null
[2025-03-03T23:26:21,181][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.ssl.certificate: null
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] *xpack.management.elasticsearch.ssl.key: null
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.cipher_suites: []
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: full
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] xpack.monitoring.allow_legacy_collection: false
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: TimeValue{duration=10, timeUnit=SECONDS}
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: TimeValue{duration=10, timeUnit=MINUTES}
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: logstash_system
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.password: null
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.proxy: null
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.cloud_id: null
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.cloud_auth: null
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.api_key: null
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.ssl.certificate_authority: null
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.ssl.ca_trusted_fingerprint: null
[2025-03-03T23:26:21,182][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.ssl.truststore.path: null
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.ssl.truststore.password: null
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.ssl.keystore.path: null
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.ssl.keystore.password: null
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: full
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.ssl.certificate: null
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] *xpack.monitoring.elasticsearch.ssl.key: null
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.cipher_suites: []
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] monitoring.enabled: false
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] monitoring.collection.interval: TimeValue{duration=10, timeUnit=SECONDS}
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] monitoring.collection.timeout_interval: TimeValue{duration=10, timeUnit=MINUTES}
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] monitoring.elasticsearch.username: logstash_system
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] *monitoring.elasticsearch.password: null
[2025-03-03T23:26:21,184][DEBUG][logstash.runner          ] *monitoring.elasticsearch.proxy: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] *monitoring.elasticsearch.cloud_id: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] *monitoring.elasticsearch.cloud_auth: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] *monitoring.elasticsearch.api_key: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] *monitoring.elasticsearch.ssl.certificate_authority: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] *monitoring.elasticsearch.ssl.ca_trusted_fingerprint: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] *monitoring.elasticsearch.ssl.truststore.path: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] *monitoring.elasticsearch.ssl.truststore.password: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] *monitoring.elasticsearch.ssl.keystore.path: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] *monitoring.elasticsearch.ssl.keystore.password: null
[2025-03-03T23:26:21,185][DEBUG][logstash.runner          ] monitoring.elasticsearch.ssl.verification_mode: full
[2025-03-03T23:26:21,186][DEBUG][logstash.runner          ] *monitoring.elasticsearch.ssl.certificate: null
[2025-03-03T23:26:21,186][DEBUG][logstash.runner          ] *monitoring.elasticsearch.ssl.key: null
[2025-03-03T23:26:21,186][DEBUG][logstash.runner          ] monitoring.elasticsearch.ssl.cipher_suites: []
[2025-03-03T23:26:21,186][DEBUG][logstash.runner          ] monitoring.elasticsearch.sniffing: false
[2025-03-03T23:26:21,186][DEBUG][logstash.runner          ] monitoring.collection.pipeline.details.enabled: true
[2025-03-03T23:26:21,186][DEBUG][logstash.runner          ] monitoring.collection.config.enabled: true
[2025-03-03T23:26:21,186][DEBUG][logstash.runner          ] node.uuid: 
[2025-03-03T23:26:21,186][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2025-03-03T23:26:21,186][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because command line options are specified
[2025-03-03T23:26:21,193][DEBUG][org.logstash.health.MultiIndicator] attached indicator pipelines=>MultiIndicator{indicators={}} (res:MultiIndicator{indicators={pipelines=MultiIndicator{indicators={}}}})
[2025-03-03T23:26:21,202][DEBUG][logstash.agent           ] Initializing API WebServer {"api.http.host"=>"127.0.0.1", "api.http.port"=>9600..9700, "api.ssl.enabled"=>false, "api.auth.type"=>"none", "api.environment"=>"production"}
[2025-03-03T23:26:21,206][DEBUG][logstash.api.service     ] [api-service] start
[2025-03-03T23:26:21,217][DEBUG][logstash.agent           ] Setting up metric collection
[2025-03-03T23:26:21,218][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2025-03-03T23:26:21,218][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2025-03-03T23:26:21,226][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2025-03-03T23:26:21,247][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Young Generation"}
[2025-03-03T23:26:21,248][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Concurrent GC"}
[2025-03-03T23:26:21,248][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"G1 Old Generation"}
[2025-03-03T23:26:21,250][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2025-03-03T23:26:21,251][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2025-03-03T23:26:21,251][DEBUG][logstash.instrument.periodicpoller.flowrate] Starting {:polling_interval=>5, :polling_timeout=>120}
[2025-03-03T23:26:21,257][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(input_throughput) baseline -> FlowCapture{nanoTimestamp=535961513103791 numerator=0.0 denominator=0.002349917}
[2025-03-03T23:26:21,260][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(filter_throughput) baseline -> FlowCapture{nanoTimestamp=535961516448083 numerator=0.0 denominator=0.005857584}
[2025-03-03T23:26:21,260][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(output_throughput) baseline -> FlowCapture{nanoTimestamp=535961516762000 numerator=0.0 denominator=0.006171959}
[2025-03-03T23:26:21,262][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(queue_backpressure) baseline -> FlowCapture{nanoTimestamp=535961518063166 numerator=0.0 denominator=7.473042}
[2025-03-03T23:26:21,262][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(worker_concurrency) baseline -> FlowCapture{nanoTimestamp=535961518269583 numerator=0.0 denominator=7.679709}
[2025-03-03T23:26:21,428][DEBUG][logstash.agent           ] Starting agent
[2025-03-03T23:26:21,429][DEBUG][logstash.agent           ] Starting API WebServer (puma)
[2025-03-03T23:26:21,431][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/Users/mashhur/Dev/elastic/logstash/config/aws-integration.conf", "/Users/mashhur/Dev/elastic/logstash/config/azure-event-hubs.conf", "/Users/mashhur/Dev/elastic/logstash/config/backpressure.conf", "/Users/mashhur/Dev/elastic/logstash/config/backpressure_test.conf", "/Users/mashhur/Dev/elastic/logstash/config/beats-and-tcp.conf", "/Users/mashhur/Dev/elastic/logstash/config/cert.pem", "/Users/mashhur/Dev/elastic/logstash/config/elastic_agent.conf", "/Users/mashhur/Dev/elastic/logstash/config/elastic_integration.conf", "/Users/mashhur/Dev/elastic/logstash/config/elastic_integration_serverless.conf", "/Users/mashhur/Dev/elastic/logstash/config/elastic_integration_simple.conf", "/Users/mashhur/Dev/elastic/logstash/config/elastic_integration_winlogbeat.conf", "/Users/mashhur/Dev/elastic/logstash/config/email.conf", "/Users/mashhur/Dev/elastic/logstash/config/encoding_test.conf", "/Users/mashhur/Dev/elastic/logstash/config/env-test.conf", "/Users/mashhur/Dev/elastic/logstash/config/env-var-test.conf", "/Users/mashhur/Dev/elastic/logstash/config/es-template", "/Users/mashhur/Dev/elastic/logstash/config/failure_injector-test.conf", "/Users/mashhur/Dev/elastic/logstash/config/file-input.conf", "/Users/mashhur/Dev/elastic/logstash/config/filter-decrypt.conf", "/Users/mashhur/Dev/elastic/logstash/config/filter-drop.conf", "/Users/mashhur/Dev/elastic/logstash/config/filter-mutate.conf", "/Users/mashhur/Dev/elastic/logstash/config/filter-ruby.conf", "/Users/mashhur/Dev/elastic/logstash/config/filter_dns.conf", "/Users/mashhur/Dev/elastic/logstash/config/filter_http.conf", "/Users/mashhur/Dev/elastic/logstash/config/filter_useragent.conf", "/Users/mashhur/Dev/elastic/logstash/config/fingerprint.conf", "/Users/mashhur/Dev/elastic/logstash/config/geoip", "/Users/mashhur/Dev/elastic/logstash/config/github.conf", "/Users/mashhur/Dev/elastic/logstash/config/grok", "/Users/mashhur/Dev/elastic/logstash/config/grpc-input.conf", "/Users/mashhur/Dev/elastic/logstash/config/hashid.conf", "/Users/mashhur/Dev/elastic/logstash/config/health_report", "/Users/mashhur/Dev/elastic/logstash/config/http-input.conf", "/Users/mashhur/Dev/elastic/logstash/config/http-output.conf", "/Users/mashhur/Dev/elastic/logstash/config/imap.conf", "/Users/mashhur/Dev/elastic/logstash/config/input-generator.config", "/Users/mashhur/Dev/elastic/logstash/config/input-heartbeat.conf", "/Users/mashhur/Dev/elastic/logstash/config/input-tcp-and-http.conf", "/Users/mashhur/Dev/elastic/logstash/config/input-tcp.conf", "/Users/mashhur/Dev/elastic/logstash/config/input_jdbc_mssql.conf", "/Users/mashhur/Dev/elastic/logstash/config/input_jdbc_mssql_non_schedule.conf", "/Users/mashhur/Dev/elastic/logstash/config/input_jdbc_mysql.conf", "/Users/mashhur/Dev/elastic/logstash/config/input_jdbc_oracle.conf", "/Users/mashhur/Dev/elastic/logstash/config/input_jdbc_oracle1.conf", "/Users/mashhur/Dev/elastic/logstash/config/input_jdbc_oracle2.conf", "/Users/mashhur/Dev/elastic/logstash/config/input_stdin.conf", "/Users/mashhur/Dev/elastic/logstash/config/inputbeats.conf", "/Users/mashhur/Dev/elastic/logstash/config/jvm.options", "/Users/mashhur/Dev/elastic/logstash/config/kafkain.conf", "/Users/mashhur/Dev/elastic/logstash/config/kafkaout.conf", "/Users/mashhur/Dev/elastic/logstash/config/kinesis.conf", "/Users/mashhur/Dev/elastic/logstash/config/legacy-template.json", 
"/Users/mashhur/Dev/elastic/logstash/config/log4j2.properties", "/Users/mashhur/Dev/elastic/logstash/config/logstash-input.conf", "/Users/mashhur/Dev/elastic/logstash/config/logstash-output-stdin.conf", "/Users/mashhur/Dev/elastic/logstash/config/logstash-output.conf", "/Users/mashhur/Dev/elastic/logstash/config/logstash-sample.conf", "/Users/mashhur/Dev/elastic/logstash/config/logstash.keystore", "/Users/mashhur/Dev/elastic/logstash/config/logstash.yml", "/Users/mashhur/Dev/elastic/logstash/config/ls-to-ls", "/Users/mashhur/Dev/elastic/logstash/config/memcached-filter.conf", "/Users/mashhur/Dev/elastic/logstash/config/mssql-jdbc-11.2.0.jre17.jar", "/Users/mashhur/Dev/elastic/logstash/config/mssql-jdbc-12.2.0.jre11.jar", "/Users/mashhur/Dev/elastic/logstash/config/mysql-connector-j-8.0.33.jar", "/Users/mashhur/Dev/elastic/logstash/config/output-csv.conf", "/Users/mashhur/Dev/elastic/logstash/config/output-fluent.conf", "/Users/mashhur/Dev/elastic/logstash/config/pipelines.yml", "/Users/mashhur/Dev/elastic/logstash/config/rackspace.conf", "/Users/mashhur/Dev/elastic/logstash/config/rag", "/Users/mashhur/Dev/elastic/logstash/config/redis.conf", "/Users/mashhur/Dev/elastic/logstash/config/s3input.conf", "/Users/mashhur/Dev/elastic/logstash/config/s3output-sdh-1296.conf", "/Users/mashhur/Dev/elastic/logstash/config/s3output.conf", "/Users/mashhur/Dev/elastic/logstash/config/salesforce.conf", "/Users/mashhur/Dev/elastic/logstash/config/sdh", "/Users/mashhur/Dev/elastic/logstash/config/simple-es-v7.conf", "/Users/mashhur/Dev/elastic/logstash/config/simple.conf", "/Users/mashhur/Dev/elastic/logstash/config/snmp_input.conf", "/Users/mashhur/Dev/elastic/logstash/config/snmp_trap.conf", "/Users/mashhur/Dev/elastic/logstash/config/startup.options", "/Users/mashhur/Dev/elastic/logstash/config/stdin.conf", "/Users/mashhur/Dev/elastic/logstash/config/udp_input.conf"]}
[2025-03-03T23:26:21,432][DEBUG][logstash.agent           ] Trying to start API WebServer {:port=>9600, :ssl_enabled=>false}
[2025-03-03T23:26:21,444][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/Users/mashhur/Dev/elastic/logstash/config/input-elasticsearch.conf"}
[2025-03-03T23:26:21,450][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2025-03-03T23:26:21,451][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2025-03-03T23:26:21,452][DEBUG][org.logstash.health.MultiIndicator] detached indicator main<=null (res:MultiIndicator{indicators={}})
[2025-03-03T23:26:21,452][DEBUG][org.logstash.health.HealthObserver] detached pipeline indicator [main]
[2025-03-03T23:26:21,453][DEBUG][org.logstash.health.MultiIndicator] attached indicator main=>ProbeIndicator{observer=org.logstash.health.PipelineIndicator$$Lambda/0x0000007001c7f458@ed10936, probes={flow:worker_utilization=org.logstash.health.PipelineIndicator$FlowWorkerUtilizationProbe@17939c1, status=org.logstash.health.PipelineIndicator$StatusProbe@128a1e5}} (res:MultiIndicator{indicators={main=ProbeIndicator{observer=org.logstash.health.PipelineIndicator$$Lambda/0x0000007001c7f458@ed10936, probes={flow:worker_utilization=org.logstash.health.PipelineIndicator$FlowWorkerUtilizationProbe@17939c1, status=org.logstash.health.PipelineIndicator$StatusProbe@128a1e5}}}})
[2025-03-03T23:26:21,453][DEBUG][org.logstash.health.HealthObserver] attached pipeline indicator [main]
[2025-03-03T23:26:21,455][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2025-03-03T23:26:21,456][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to load or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2025-03-03T23:26:21,480][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2025-03-03T23:26:21,485][DEBUG][org.logstash.secret.store.backend.JavaKeyStore] retrieved secret urn:logstash:secret:v1:keystore.seed
[2025-03-03T23:26:21,485][DEBUG][org.logstash.secret.store.backend.JavaKeyStore] Using existing keystore at /Users/mashhur/Dev/elastic/logstash/config/logstash.keystore
[2025-03-03T23:26:21,622][INFO ][org.reflections.Reflections] Reflections took 55 ms to scan 1 urls, producing 149 keys and 521 values
[2025-03-03T23:26:21,631][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2025-03-03T23:26:21,631][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to load or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2025-03-03T23:26:21,641][DEBUG][org.logstash.secret.store.backend.JavaKeyStore] retrieved secret urn:logstash:secret:v1:keystore.seed
[2025-03-03T23:26:21,641][DEBUG][org.logstash.secret.store.backend.JavaKeyStore] Using existing keystore at /Users/mashhur/Dev/elastic/logstash/config/logstash.keystore
[2025-03-03T23:26:22,461][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"input", :class=>LogStash::Inputs::Elasticsearch}
[2025-03-03T23:26:22,468][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2025-03-03T23:26:22,472][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_5c94b72b-df88-46d7-867f-ac9cf03ec85f"
[2025-03-03T23:26:22,472][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2025-03-03T23:26:22,472][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2025-03-03T23:26:22,476][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@api_key = <password>
[2025-03-03T23:26:22,476][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@cloud_id = "initial-817-cluster:dXMtd2VzdC0yLmF3cy5mb3VuZC5pbzo0NDMkZjNkMjliMGM0NTUyNDk1NDlmN2NiMWE4NzZmMjQyZDgkMDFiNzM4NGFjNDY1NDdhYTljYjgxYjE4MGViZDMzOTI="
[2025-03-03T23:26:22,476][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@id = "edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479"
[2025-03-03T23:26:22,476][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@enable_metric = true
[2025-03-03T23:26:22,476][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@codec = <LogStash::Codecs::Plain id=>"plain_5c94b72b-df88-46d7-867f-ac9cf03ec85f", enable_metric=>true, charset=>"UTF-8">
[2025-03-03T23:26:22,476][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@add_field = {}
[2025-03-03T23:26:22,476][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@index = "logstash-*"
[2025-03-03T23:26:22,476][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@query = "{ \"sort\": [ \"_doc\" ] }"
[2025-03-03T23:26:22,476][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@response_type = "hits"
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@size = 1000
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@retries = 0
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@search_api = "auto"
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@scroll = "1m"
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@docinfo = false
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@docinfo_fields = ["_index", "_type", "_id"]
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@custom_headers = {}
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@connect_timeout_seconds = 10
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@request_timeout_seconds = 60
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@socket_timeout_seconds = 60
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@ssl_supported_protocols = []
[2025-03-03T23:26:22,477][DEBUG][logstash.inputs.elasticsearch] config LogStash::Inputs::Elasticsearch/@ssl_verification_mode = "full"
[2025-03-03T23:26:22,485][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"filter", :class=>LogStash::Filters::Elasticsearch}
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@cloud_id = "initial-817-cluster:dXMtd2VzdC0yLmF3cy5mb3VuZC5pbzo0NDMkZjNkMjliMGM0NTUyNDk1NDlmN2NiMWE4NzZmMjQyZDgkMDFiNzM4NGFjNDY1NDdhYTljYjgxYjE4MGViZDMzOTI="
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@id = "e5c9e70312257dcd89442c55f9257847fea680ccdeb3141449325e93d13a600f"
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@fields = {"@timestamp"=>"started"}
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@api_key = <password>
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@query = "type:start AND operation:%{[opid]}"
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@enable_metric = true
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@add_tag = []
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@remove_tag = []
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@add_field = {}
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@remove_field = []
[2025-03-03T23:26:22,494][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@periodic_flush = false
[2025-03-03T23:26:22,495][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@hosts = ["localhost:9200"]
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@index = ""
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@sort = "@timestamp:desc"
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@custom_headers = {}
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@docinfo_fields = {}
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@aggregation_fields = {}
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@ssl_supported_protocols = []
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@ssl_verification_mode = "full"
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@enable_sort = true
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@result_size = 1
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@tag_on_failure = ["_elasticsearch_lookup_failure"]
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@retry_on_failure = 0
[2025-03-03T23:26:22,496][DEBUG][logstash.filters.elasticsearch] config LogStash::Filters::Elasticsearch/@retry_on_status = [500, 502, 503, 504]
[2025-03-03T23:26:22,498][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout}
[2025-03-03T23:26:22,503][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[2025-03-03T23:26:22,508][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_cbcceab4-76b4-4fd8-ae48-4d62d4ca80a8"
[2025-03-03T23:26:22,508][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true
[2025-03-03T23:26:22,508][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false
[2025-03-03T23:26:22,548][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_cbcceab4-76b4-4fd8-ae48-4d62d4ca80a8", enable_metric=>true, metadata=>false>
[2025-03-03T23:26:22,548][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@id = "b81336e386371df23c4d073056ae47dfc482053186a96c3cac332bb5c3511586"
[2025-03-03T23:26:22,549][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@enable_metric = true
[2025-03-03T23:26:22,549][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@workers = 1
[2025-03-03T23:26:22,554][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2025-03-03T23:26:22,555][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(input_throughput) baseline -> FlowCapture{nanoTimestamp=535962811478791 numerator=0.0 denominator=0.000131959}
[2025-03-03T23:26:22,556][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `input_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-03T23:26:22,556][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(filter_throughput) baseline -> FlowCapture{nanoTimestamp=535962812182791 numerator=0.0 denominator=0.000836625}
[2025-03-03T23:26:22,556][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `filter_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-03T23:26:22,556][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(output_throughput) baseline -> FlowCapture{nanoTimestamp=535962812405625 numerator=0.0 denominator=0.001059667}
[2025-03-03T23:26:22,556][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `output_throughput` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-03T23:26:22,556][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(queue_backpressure) baseline -> FlowCapture{nanoTimestamp=535962812666041 numerator=0.0 denominator=1.320084}
[2025-03-03T23:26:22,557][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `queue_backpressure` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-03T23:26:22,557][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(worker_concurrency) baseline -> FlowCapture{nanoTimestamp=535962813139333 numerator=0.0 denominator=1.793375}
[2025-03-03T23:26:22,557][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_concurrency` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-03T23:26:22,557][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(worker_utilization) baseline -> FlowCapture{nanoTimestamp=535962813502083 numerator=0.0 denominator=21.55584}
[2025-03-03T23:26:22,557][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :flow]`
[2025-03-03T23:26:22,558][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(throughput) baseline -> FlowCapture{nanoTimestamp=535962814092291 numerator=0.0 denominator=0.002746417}
[2025-03-03T23:26:22,558][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `throughput` in namespace `[:stats, :pipelines, :main, :plugins, :inputs, :edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479, :flow]`
[2025-03-03T23:26:22,558][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(worker_millis_per_event) baseline -> FlowCapture{nanoTimestamp=535962814633666 numerator=0.0 denominator=0.0}
[2025-03-03T23:26:22,558][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(worker_utilization) baseline -> FlowCapture{nanoTimestamp=535962814727416 numerator=0.0 denominator=33.8125}
[2025-03-03T23:26:22,559][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :e5c9e70312257dcd89442c55f9257847fea680ccdeb3141449325e93d13a600f, :flow]`
[2025-03-03T23:26:22,559][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :filters, :e5c9e70312257dcd89442c55f9257847fea680ccdeb3141449325e93d13a600f, :flow]`
[2025-03-03T23:26:22,559][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(worker_millis_per_event) baseline -> FlowCapture{nanoTimestamp=535962815095708 numerator=0.0 denominator=0.0}
[2025-03-03T23:26:22,559][TRACE][org.logstash.instrument.metrics.BaseFlowMetric] FlowMetric(worker_utilization) baseline -> FlowCapture{nanoTimestamp=535962815182125 numerator=0.0 denominator=38.36}
[2025-03-03T23:26:22,559][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_millis_per_event` in namespace `[:stats, :pipelines, :main, :plugins, :outputs, :b81336e386371df23c4d073056ae47dfc482053186a96c3cac332bb5c3511586, :flow]`
[2025-03-03T23:26:22,559][DEBUG][org.logstash.execution.AbstractPipelineExt] Flow metric registered: `worker_utilization` in namespace `[:stats, :pipelines, :main, :plugins, :outputs, :b81336e386371df23c4d073056ae47dfc482053186a96c3cac332bb5c3511586, :flow]`
[2025-03-03T23:26:22,559][DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main"}
[2025-03-03T23:26:22,572][INFO ][logstash.filters.elasticsearch][main] New ElasticSearch filter client {:hosts=>["https://f3d29b0c455249549f7cb1a876f242d8.us-west-2.aws.found.io:443"]}
[2025-03-03T23:26:22,819][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>10, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1250, "pipeline.sources"=>["/Users/mashhur/Dev/elastic/logstash/config/input-elasticsearch.conf"], :thread=>"#<Thread:0x2e7382c5 /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-03-03T23:26:23,092][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.27}
[2025-03-03T23:26:23,181][INFO ][logstash.inputs.elasticsearch][main] `search_api => auto` resolved to `search_after` {:elasticsearch=>"8.17.0"}
[2025-03-03T23:26:23,182][INFO ][logstash.inputs.elasticsearch][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2025-03-03T23:26:23,182][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2025-03-03T23:26:23,183][INFO ][logstash.inputs.elasticsearch.searchafter][main][edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479] Create point in time (PIT)
[2025-03-03T23:26:23,184][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2025-03-03T23:26:23,192][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2e7382c5 /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:138 sleep>"}
[2025-03-03T23:26:23,195][TRACE][logstash.agent           ] Converge results {:success=>true, :failed_actions=>[], :successful_actions=>["id: main, action_type: LogStash::PipelineAction::Create"]}
[2025-03-03T23:26:23,196][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2025-03-03T23:26:23,196][DEBUG][logstash.pipelineresourceusagevalidator] For a baseline of 2KB events, the maximum heap memory consumed across 1 pipelines may reach up to 0.24% of the entire heap (more if the events are bigger).
[2025-03-03T23:26:23,205][INFO ][logstash.inputs.elasticsearch.searchafter][main][edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479] Query start
[2025-03-03T23:26:23,205][DEBUG][logstash.inputs.elasticsearch.searchafter][main][edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479] Query progress
[2025-03-03T23:26:23,205][TRACE][logstash.inputs.elasticsearch.searchafter][main][edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479] search options {:size=>1000, :body=>{"sort"=>["_doc"], :pit=>{:id=>"yvaYBAAA", :keep_alive=>"1m"}}}
[2025-03-03T23:26:23,229][INFO ][logstash.inputs.elasticsearch.searchafter][main][edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479] Query completed
[2025-03-03T23:26:23,230][INFO ][logstash.inputs.elasticsearch.searchafter][main][edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479] Closing point in time (PIT)
[2025-03-03T23:26:23,250][DEBUG][logstash.inputs.elasticsearch][main][edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479] Closing {:plugin=>"LogStash::Inputs::Elasticsearch"}
[2025-03-03T23:26:23,251][DEBUG][logstash.pluginmetadata  ][main][edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479] Removing metadata for plugin edb6c95865b83ed2b5bb5044b86eb84cb3e9c7f6965b42978bf19d1016a6f479
[2025-03-03T23:26:23,251][DEBUG][logstash.javapipeline    ][main] Input plugins stopped! Will shutdown filter/output workers. {:pipeline_id=>"main", :thread=>"#<Thread:0x2e7382c5 /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-03-03T23:26:23,252][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x74cdc2f1 /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 run>"}
[2025-03-03T23:26:23,312][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x353cd31e /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 run>"}
[2025-03-03T23:26:23,361][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x15dbc6a0 /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 dead>"}
[2025-03-03T23:26:23,361][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x529564bf /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 dead>"}
[2025-03-03T23:26:23,361][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x4cc16598 /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 dead>"}
[2025-03-03T23:26:23,361][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x2361275b /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 dead>"}
[2025-03-03T23:26:23,362][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x41e2ba0c /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 dead>"}
[2025-03-03T23:26:23,372][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x71a05cab /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 dead>"}
[2025-03-03T23:26:23,372][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x64d8d194 /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 dead>"}
[2025-03-03T23:26:23,372][DEBUG][logstash.javapipeline    ][main] Shutdown waiting for worker thread {:pipeline_id=>"main", :thread=>"#<LogStash::WorkerLoopThread:0x43709b06 /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:304 dead>"}
[2025-03-03T23:26:23,372][DEBUG][logstash.filters.elasticsearch][main] Closing {:plugin=>"LogStash::Filters::Elasticsearch"}
[2025-03-03T23:26:23,372][DEBUG][logstash.pluginmetadata  ][main] Removing metadata for plugin e5c9e70312257dcd89442c55f9257847fea680ccdeb3141449325e93d13a600f
[2025-03-03T23:26:23,372][DEBUG][logstash.outputs.stdout  ][main] Closing {:plugin=>"LogStash::Outputs::Stdout"}
[2025-03-03T23:26:23,372][DEBUG][logstash.pluginmetadata  ][main] Removing metadata for plugin b81336e386371df23c4d073056ae47dfc482053186a96c3cac332bb5c3511586
[2025-03-03T23:26:23,372][DEBUG][logstash.javapipeline    ][main] Pipeline has been shutdown {:pipeline_id=>"main", :thread=>"#<Thread:0x2e7382c5 /Users/mashhur/Dev/elastic/logstash/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-03-03T23:26:23,373][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2025-03-03T23:26:23,706][DEBUG][logstash.agent           ] Shutting down all pipelines {:pipelines_count=>0}
[2025-03-03T23:26:23,707][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2025-03-03T23:26:23,708][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Delete/pipeline_id:main}
[2025-03-03T23:26:23,709][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2025-03-03T23:26:23,709][DEBUG][org.logstash.health.MultiIndicator] detached indicator main<=null (res:MultiIndicator{indicators={}})
[2025-03-03T23:26:23,709][DEBUG][org.logstash.health.HealthObserver] detached pipeline indicator [main]
[2025-03-03T23:26:23,709][TRACE][logstash.agent           ] Converge results {:success=>true, :failed_actions=>[], :successful_actions=>["id: main, action_type: LogStash::PipelineAction::Delete"]}
[2025-03-03T23:26:23,710][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2025-03-03T23:26:23,710][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2025-03-03T23:26:23,710][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2025-03-03T23:26:23,710][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2025-03-03T23:26:23,710][DEBUG][logstash.instrument.periodicpoller.flowrate] Stopping
[2025-03-03T23:26:23,713][DEBUG][logstash.agent           ] API WebServer has stopped running
[2025-03-03T23:26:23,713][INFO ][logstash.runner          ] Logstash shut down.
➜  logstash git:(es-ruby-client-upgrade) ✗ 


```

---

This is an automatic backport of pull request #17161 done by [Mergify](https://mergify.com).

---

This is an automatic backport of pull request #17306 done by [Mergify](https://mergify.com).

* Upgrade elasticsearch-ruby client. (#17161)

* Fix Faraday removed basic auth option and apply the ES client module name change.

(cherry picked from commit e748488)

* Apply the required changes in elasticsearch_client.rb after upgrading the elasticsearch-ruby client to 8.x

* Swallow the exception and make non-connectable client when ES client raises connection refuses exception.

---------

Co-authored-by: Mashhur <[email protected]>
Co-authored-by: Mashhur <[email protected]>
(cherry picked from commit 7f74ce3)
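
As context for the commit notes above ("Fix Faraday removed basic auth option and apply the ES client module name change"), here is a hedged, illustrative sketch of what those two adjustments typically look like, assuming Faraday 2.x; it is not the actual elasticsearch_client.rb code.

```ruby
require 'faraday'

# Faraday 2.x removed Connection#basic_auth; credentials are now supplied via
# the authorization request middleware instead.
conn = Faraday.new(url: 'https://localhost:9200') do |f|
  # Faraday 1.x (removed in 2.x):
  #   f.basic_auth('logstash_system', 'changeme')
  f.request :authorization, :basic, 'logstash_system', 'changeme'
end

# The module name change that callers of the transport layer have to follow,
# for example when rescuing transport errors:
#   Elasticsearch::Transport::Transport::Errors::NotFound  # elasticsearch-transport (7.x)
#   Elastic::Transport::Transport::Errors::NotFound        # elastic-transport (8.x)
```
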
@mashhurs requested a review from andsel on March 17, 2025 at 16:22
@andsel (Contributor) left a comment:


LGTM when it's green.

Please update the title of the PR from

[8.x] Upgrade ...

to

[8.18] Upgrade ...

@mashhurs changed the title from "[8.x] Upgrade elasticsearch-ruby client. (backport #17161) (backport #17306)" to "[8.18] Upgrade elasticsearch-ruby client. (backport #17161) (backport #17306)" on Mar 17, 2025
@elastic-sonarqube

Quality Gate passed

Issues
0 New issues
0 Fixed issues
0 Accepted issues

Measures
0 Security Hotspots
No data about Coverage
No data about Duplication

See analysis details on SonarQube

@elasticmachine (Collaborator)

💚 Build Succeeded

@mashhurs merged commit 59bec6e into 8.18 on Mar 17, 2025
7 checks passed
@mashhurs deleted the mergify/bp/8.18/pr-17306 branch on March 17, 2025 at 17:11