Changes from all commits
21 commits
311f9c0
Doc:Add management and monitoring info to API key docs
karenzone Jul 8, 2020
f40d1fa
[DOCS] Change links to refactored Beats getting started docs
dedemorton Jul 1, 2020
616e600
add dependency notice for amazing_print
jsvd Jul 8, 2020
001cefc
Doc:Replace outdated pipeline viewer screenshot
karenzone Jul 6, 2020
ab69cca
Start fleshing out content
karenzone Jul 9, 2020
0f5cdae
[build] Fix gradle typo
robbavey Jul 13, 2020
1c864b9
Fix kafka setup scripts
robbavey Jun 16, 2020
f27b45d
Doc:Add info on reserved fields in events
karenzone Jul 9, 2020
d706e50
[build] Ensure more gradle tasks using task avoidance API
robbavey Jul 13, 2020
62519ac
monitor worker threads exceptions to not crash logstash, just the fai…
colinsurprenant Jul 2, 2020
2afe60d
fix PipelineRegistry to avoid re-creating a pipeline in the process o…
colinsurprenant Jul 2, 2020
87df15d
ignore default username when no password is set
colinsurprenant Jul 8, 2020
e670cbf
don't call runIntegrationTests from check gradle task
jsvd Jul 15, 2020
1d80d3a
add ci script setup dependencies
jsvd Jul 15, 2020
d9953c6
Doc:Create a new header for integration plugins
karenzone May 12, 2020
25b7d84
Document use of keystore values in pipelines.yml
jsvd Jul 22, 2020
72cff96
initial introduction of .fossa.yml
jsvd Jul 20, 2020
ef4ae81
Fix docker image labels
robbavey Jul 16, 2020
1c3d3b8
reword bin/system_install help text to be less confusing.
yaauie Jul 27, 2020
606e582
Fix formatting possibly causing docs-ci failure
karenzone Jul 28, 2020
44b973d
Merge branch 'api-mon' of https://github.com/karenzone/logstash into …
karenzone Jul 28, 2020
59 changes: 59 additions & 0 deletions .fossa.yml
@@ -0,0 +1,59 @@
# Generated by FOSSA CLI (https://github.com/fossas/fossa-cli)
# Visit https://fossa.com to learn more

version: 2
cli:
  server: https://app.fossa.com
  fetcher: custom
  project: [email protected]:elastic/logstash.git
analyze:
  modules:
    - name: Logstash gems
      type: bundler
      strategy: lockfile
      target: .
      path: .
    - name: benchmark-cli
      type: gradle
      target: 'benchmark-cli:'
      path: .
    - name: dependencies-report
      type: gradle
      target: 'dependencies-report:'
      path: .
    - name: ingest-converter
      type: gradle
      target: 'ingest-converter:'
      path: .
    - name: logstash-core
      type: gradle
      target: 'logstash-core:'
      path: .
    - name: logstash-core-benchmarks
      type: gradle
      target: 'logstash-core-benchmarks:'
      path: .
    - name: logstash-integration-tests
      type: gradle
      target: 'logstash-integration-tests:'
      path: .
    - name: logstash-xpack
      type: gradle
      target: 'logstash-xpack:'
    # path: .
    # - name: docker
    #   type: pip
    #   target: docker
    #   path: docker
    # - name: Gemfile
    #   type: gem
    #   target: qa
    #   path: qa
    # - name: Gemfile
    #   type: gem
    #   target: qa/integration
    #   path: qa/integration
    # - name: Gemfile
    #   type: gem
    #   target: tools/paquet
    #   path: tools/paquet
4 changes: 2 additions & 2 deletions bin/system-install
@@ -17,8 +17,8 @@ elif [ "$1" == "-h" ] || [ "$1" == "--help" ]; then
echo
echo "OPTIONSFILE: Full path to a startup.options file"
echo "OPTIONSFILE is required if STARTUPTYPE is specified, but otherwise looks first"
echo "in $LOGSTASH_HOME/config/startup.options and then /etc/logstash/startup.options"
echo "Last match wins"
echo "in /etc/logstash/startup.options and then "
echo "in $LOGSTASH_HOME/config/startup.options "
echo
echo "STARTUPTYPE: e.g. sysv, upstart, systemd, etc."
echo "OPTIONSFILE is required to specify a STARTUPTYPE."
31 changes: 16 additions & 15 deletions build.gradle
@@ -131,7 +131,7 @@ tasks.register("configureArchitecture") {
String esArch = arch

// For aarch64 architectures, beats and elasticsearch name their artifacts differently
if (arch == "aarch64") {i
if (arch == "aarch64") {
beatsArch="arm64"
esArch="aarch64"
} else if (arch == "amd64") {
@@ -450,12 +450,9 @@ tasks.register("runIntegrationTests"){
dependsOn tasks.getByPath(":logstash-integration-tests:integrationTests")
dependsOn copyEs
dependsOn copyFilebeat
shouldRunAfter ":logstash-core:test"
}

bootstrap.dependsOn assemblyDeps

runIntegrationTests.shouldRunAfter tasks.getByPath(":logstash-core:test")
check.dependsOn runIntegrationTests


tasks.register("generateLicenseReport", JavaExec) {
@@ -506,10 +503,13 @@ tasks.register("generatePluginsVersion") {
}

bootstrap.dependsOn assemblyDeps

runIntegrationTests.shouldRunAfter tasks.getByPath(":logstash-core:test")
check.dependsOn runIntegrationTests

// FIXME: adding the integration tests task to check will mean
// that any registered task will be evaluated. This creates an issue
// where the downloadES task may throw an error on versions where
// Elasticsearch doesn't yet have a build we can fetch
// So for now we'll remove this to unblock builds, but finding a way
// to compartimentalize failures is needed going forward
//check.dependsOn runIntegrationTests

Boolean oss = System.getenv('OSS').equals('true')

@@ -520,11 +520,12 @@ if (!oss) {
dependsOn installTestGems
}
}
tasks.getByPath(":logstash-xpack:rubyIntegrationTests").configure {
dependsOn copyEs
}
}

task runXPackUnitTests(dependsOn: [tasks.getByPath(":logstash-xpack:rubyTests")]) {}
task runXPackIntegrationTests(dependsOn: [tasks.getByPath(":logstash-xpack:rubyIntegrationTests")]) {}
}

tasks.register("runXPackUnitTests"){
dependsOn ":logstash-xpack:rubyTests"
}
tasks.register("runXPackIntegrationTests"){
dependsOn ":logstash-xpack:rubyIntegrationTests"
}
3 changes: 3 additions & 0 deletions ci/bootstrap_dependencies.sh
@@ -0,0 +1,3 @@
#!/bin/bash

./gradlew installDefaultGems
4 changes: 4 additions & 0 deletions docker/Makefile
@@ -21,6 +21,7 @@ HTTPD ?= logstash-docker-artifact-server
FIGLET := pyfiglet -w 160 -f puffy

all: build-from-local-artifacts build-from-local-oss-artifacts public-dockerfiles
DATE:= $(shell date -u +'%Y-%m-%dT%H:%M:%S.%sZ')

lint: venv
flake8 tests
@@ -70,13 +71,15 @@ docker_paths:

public-dockerfiles: venv templates/Dockerfile.j2 docker_paths $(COPY_FILES)
jinja2 \
-D created_date='$(DATE)' \
-D elastic_version='$(ELASTIC_VERSION)' \
-D version_tag='$(VERSION_TAG)' \
-D image_flavor='full' \
-D local_artifacts='false' \
-D release='$(RELEASE)' \
templates/Dockerfile.j2 > $(ARTIFACTS_DIR)/Dockerfile-full && \
jinja2 \
-D created_date='$(DATE)' \
-D elastic_version='$(ELASTIC_VERSION)' \
-D version_tag='$(VERSION_TAG)' \
-D image_flavor='oss' \
@@ -133,6 +136,7 @@ env2yaml: golang
dockerfile: venv templates/Dockerfile.j2
$(foreach FLAVOR, $(IMAGE_FLAVORS), \
jinja2 \
-D created_date='$(DATE)' \
-D elastic_version='$(ELASTIC_VERSION)' \
-D version_tag='$(VERSION_TAG)' \
-D image_flavor='$(FLAVOR)' \
17 changes: 11 additions & 6 deletions docker/templates/Dockerfile.j2
@@ -7,8 +7,10 @@

{% if image_flavor == 'oss' -%}
{% set tarball = 'logstash-oss-%s.tar.gz' % elastic_version -%}
{% set license = 'Apache 2.0' -%}
{% else -%}
{% set tarball = 'logstash-%s.tar.gz' % elastic_version -%}
{% set license = 'Elastic License' -%}
{% endif -%}


@@ -62,17 +64,20 @@ ADD env2yaml/env2yaml /usr/local/bin/
EXPOSE 9600 5044


LABEL org.label-schema.schema-version="1.0" \
LABEL org.label-schema.schema-version="1.0" \
org.label-schema.vendor="Elastic" \
org.opencontainers.image.vendor="Elastic" \
org.label-schema.name="logstash" \
org.opencontainers.image.title="logstash" \
org.label-schema.version="{{ elastic_version }}" \
org.opencontainers.image.version="{{ elastic_version }}" \
org.label-schema.url="https://www.elastic.co/products/logstash" \
org.label-schema.vcs-url="https://github.com/elastic/logstash" \
{% if image_flavor == 'oss' -%}
license="Apache-2.0"
{% else -%}
license="Elastic License"
{% endif -%}
license="{{ license }}" \
org.label-schema.license="{{ license }}" \
org.opencontainers.image.licenses="{{ license }}" \
org.label-schema.build-date={{ created_date }} \
org.opencontainers.image.created={{ created_date }}


ENTRYPOINT ["/usr/local/bin/docker-entrypoint"]
39 changes: 39 additions & 0 deletions docs/include/plugin_header-integration.asciidoc
@@ -0,0 +1,39 @@
ifeval::["{versioned_docs}"!="true"]
[subs="attributes"]
++++
<titleabbrev>{plugin}</titleabbrev>
++++
endif::[]
ifeval::["{versioned_docs}"=="true"]
[subs="attributes"]
++++
<titleabbrev>{version}</titleabbrev>
++++
endif::[]

* A component of the <<plugins-integrations-{plugin},{plugin} integration plugin>>
* Integration version: {version}
* Released on: {release_date}
* {changelog_url}[Changelog]

ifeval::["{versioned_docs}"!="true"]

For other versions, see the
{lsplugindocs}/{type}-{plugin}-index.html[Versioned plugin docs].

endif::[]

ifeval::["{versioned_docs}"=="true"]

For other versions, see the <<integration-{plugin}-index,overview list>>.

To learn more about Logstash, see the {logstash-ref}/index.html[Logstash Reference].

endif::[]

==== Getting Help

For questions about the plugin, open a topic in the http://discuss.elastic.co[Discuss] forums.
For bugs or feature requests, open an issue in https://github.com/logstash-plugins/logstash-integration-{plugin}[Github].
For the list of Elastic supported plugins, please consult the https://www.elastic.co/support/matrix#matrix_logstash_plugins[Elastic Support Matrix].

4 changes: 2 additions & 2 deletions docs/static/advanced-pipeline.asciidoc
@@ -31,7 +31,7 @@ input plugin enables Logstash to receive events from the Elastic Beats framework
to work with the Beats framework, such as Packetbeat and Metricbeat, can also send event data to Logstash.

To install Filebeat on your data source machine, download the appropriate package from the Filebeat https://www.elastic.co/downloads/beats/filebeat[product page]. You can also refer to
{filebeat-ref}/filebeat-getting-started.html[Getting Started with Filebeat] in the Beats documentation for additional
{filebeat-ref}/filebeat-installation-configuration.html[Filebeat quick start] for additional
installation instructions.

After installing Filebeat, you need to configure it. Open the `filebeat.yml` file located in your Filebeat installation
@@ -654,7 +654,7 @@ If you are using Kibana to visualize your data, you can also explore the Filebea

image::static/images/kibana-filebeat-data.png[Discovering Filebeat data in Kibana]

See the {filebeat-ref}/filebeat-getting-started.html[Filebeat getting started docs] for info about loading the Kibana
See the {filebeat-ref}/filebeat-installation-configuration.html[Filebeat quick start docs] for info about loading the Kibana
index pattern for Filebeat.

You've successfully created a pipeline that uses Filebeat to take Apache web logs as input, parses those logs to
4 changes: 2 additions & 2 deletions docs/static/fb-ls-kafka-example.asciidoc
@@ -28,8 +28,8 @@ The `-e` flag is optional and sends output to standard error instead of syslog.
A connection to {es} and {kib} is required for this one-time setup
step because {filebeat} needs to create the index template in {es} and
load the sample dashboards into {kib}. For more information about configuring
the connection to {es}, see the Filebeat modules
{filebeat-ref}/filebeat-modules-quickstart.html[quick start].
the connection to {es}, see the Filebeat
{filebeat-ref}/filebeat-installation-configuration.html[quick start].
+
After the template and dashboards are loaded, you'll see the message `INFO
{kib} dashboards successfully loaded. Loaded dashboards`.
Binary file modified docs/static/monitoring/images/pipeline-tree.png
2 changes: 1 addition & 1 deletion docs/static/monitoring/monitoring-mb.asciidoc
@@ -52,7 +52,7 @@ monitoring.cluster_uuid: PRODUCTION_ES_CLUSTER_UUID
[[configure-metricbeat]]
==== Install and configure {metricbeat}

. {metricbeat-ref}/metricbeat-installation.html[Install {metricbeat}] on the
. {metricbeat-ref}/metricbeat-installation-configuration.html[Install {metricbeat}] on the
same server as {ls}.

. Enable the `logstash-xpack` module in {metricbeat}. +
2 changes: 2 additions & 0 deletions docs/static/processing-info.asciidoc
@@ -44,3 +44,5 @@ processing cost required to preserve order.
The Java pipeline initialization time appears in the startup logs at INFO level.
Initialization time is the time it takes to compile the pipeline config and
instantiate the compiled execution for all workers.

include::reserved-fields.asciidoc[]
39 changes: 39 additions & 0 deletions docs/static/reserved-fields.asciidoc
@@ -0,0 +1,39 @@
[float]
[[reserved-fields]]
==== Reserved fields in {ls} events

Some fields in {ls} events are reserved, or are required to adhere to a certain
shape. Using these fields can cause runtime exceptions when the event API or
plugins encounter incompatible values.

[cols="<,<",options="header",]
|=======================================================================
| |
| <<metadata,`@metadata`>> |A key/value map.

Ruby-based Plugin API: value is an
https://javadoc.io/static/org.jruby/jruby-core/9.2.5.0/org/jruby/RubyHash.html[org.jruby.RubyHash].

Java-based Plugin API: value is an
https://github.com/elastic/logstash/blob/master/logstash-core/src/main/java/org/logstash/ConvertedMap.java[org.logstash.ConvertedMap].

In serialized form (such as JSON): a key/value map where the keys must be
strings and the values are not constrained to a particular type.

| `@timestamp` |An object holding representation of a specific moment in time.

Ruby-based Plugin API: value is an
https://javadoc.io/static/org.jruby/jruby-core/9.2.5.0/org/jruby/RubyTime.html[org.jruby.RubyTime].

Java-based Plugin API: value is a
https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/time/Instant.html[java.time.Instant].

In serialized form (such as JSON) or when setting with Event#set: an
ISO8601-compliant String value is acceptable.

| `@version` |A string, holding an integer value.
| `tags` |An array of distinct strings
|=======================================================================
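To make the table concrete, here is a brief sketch of how these reserved fields look through the Ruby plugin API's event getters and setters, written in the script form accepted by the ruby filter plugin. It is illustrative only: the field values are invented for the example, and it assumes the standard `event.get`/`event.set` API described in the table above.

# Illustrative sketch only: a script-style ruby filter exercising the reserved
# fields via the Ruby plugin API (assumed: event.get / event.set).
def register(params)
end

def filter(event)
  # @metadata: a key/value map with string keys; this key and value are invented.
  event.set("[@metadata][ingest_note]", "scratch value")
  # @timestamp: an ISO8601-compliant string is acceptable when setting with Event#set.
  event.set("@timestamp", "2020-07-28T12:00:00Z")
  # @version: a string holding an integer value.
  event.set("@version", "1")
  # tags: an array of distinct strings.
  event.set("tags", ["beats", "parsed"])
  [event] # the ruby filter script contract returns an array of events
end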



21 changes: 21 additions & 0 deletions docs/static/security/api-keys.asciidoc
@@ -176,6 +176,27 @@ filter {
<1> Format is `id:api_key` (as returned by {ref}/security-api-create-api-key.html[Create API key])


[float]
[[ls-api-key-mon]]
====== Create an API key for monitoring and management



/////
API keys are tied to the cluster they were created in.

In a single cluster a key can be shared for
ingestion plus monitoring purposes, while a production cluster and monitoring
cluster setup will require separate keys.

{ls} can send both collected data and monitoring information to {es}. If you are
sending both to the same cluster, you can use the same API key. For different
clusters, you need to use an API key per cluster.
/////




[float]
[[learn-more-api-keys]]
===== Learn more about API keys
4 changes: 2 additions & 2 deletions docs/static/settings-file.asciidoc
@@ -27,8 +27,8 @@ pipeline.batch.size: 125
pipeline.batch.delay: 50
-------------------------------------------------------------------------------------

The `logstash.yml` file also supports bash-style interpolation of environment variables in
setting values.
The `logstash.yml` file also supports bash-style interpolation of environment variables and
keystore secrets in setting values.

[source,yaml]
-------------------------------------------------------------------------------------
4 changes: 3 additions & 1 deletion logstash-core/lib/logstash/agent.rb
@@ -200,7 +200,9 @@ def converge_state_and_update

converge_result
rescue => e
logger.error("An exception happened when converging configuration", :exception => e.class, :message => e.message, :backtrace => e.backtrace)
attributes = {:exception => e.class, :message => e.message}
attributes.merge!({:backtrace => e.backtrace}) if logger.debug?
logger.error("An exception happened when converging configuration", attributes)
end

# Calculate the Logstash uptime in milliseconds
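The hunk above attaches the (potentially large) backtrace only when debug logging is enabled. A rough standalone Ruby sketch of the same pattern, using the stdlib Logger purely as a stand-in for Logstash's logging subsystem:

require "logger"

logger = Logger.new($stderr)
logger.level = Logger::INFO # set to Logger::DEBUG to include backtraces

begin
  raise "configuration converge failed" # illustrative failure
rescue => e
  attributes = {:exception => e.class, :message => e.message}
  # Pay for serializing the backtrace only when debug logging is enabled.
  attributes[:backtrace] = e.backtrace if logger.debug?
  logger.error("An exception happened when converging configuration #{attributes}")
end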