Conversation

@ktff
Contributor

@ktff ktff commented May 15, 2023

Closes #1374

Implementation of parquet codec for aws_s3 sink.

Usage is the same as with the avro codec: set the codec to parquet and define encoding.parquet.schema.
Example:

[sinks.test_aws.encoding]
  codec = "parquet"
  parquet.schema = '''
    message test {
        required binary message;
    }
  '''

Currently there is no enforcement of the schema before the serializer, so a single event that doesn't satisfy the schema will fail the entire batch.

This is not meant for merging and will remain in draft until batch codecs have landed. On that note, the serializer itself is generic; what's less clean is the way it's woven into the current codec abstractions. It can also be viewed as an example of where the problematic spots are.

Todo/Extensions

  • Documentation
  • Logical types (MAP and LIST are supported)
  • Column compression (Future Extension)
  • Event definition/schema mismatch:
    * Try to enforce the schema in the sink through schema_requirement. Unfortunately that doesn't seem possible at the moment: the parquet schema doesn't attach meaning to its fields, which schema_requirement requires.
    * Or, fail only the invalid events, not the whole batch.
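The second option above — rejecting only the non-conforming events instead of failing the whole batch — can be sketched in a few lines. This is a minimal illustration in Python, not Vector code; the field table and function name are hypothetical, standing in for the `required binary message` field of the example schema:

```python
# Sketch of per-event schema checking before batch serialization.
# All names here are illustrative, not Vector or parquet APIs.

REQUIRED_FIELDS = {"message": bytes}  # mirrors `required binary message`

def partition_events(events):
    """Split events into (valid, invalid) rather than failing the whole batch."""
    valid, invalid = [], []
    for event in events:
        ok = all(
            field in event and isinstance(event[field], ftype)
            for field, ftype in REQUIRED_FIELDS.items()
        )
        (valid if ok else invalid).append(event)
    return valid, invalid

events = [{"message": b"hello"}, {"message": "not-bytes"}, {}]
valid, invalid = partition_events(events)
print(len(valid), len(invalid))  # 1 2
```

Only the valid partition would then be handed to the serializer; the invalid events could be dropped or routed to a dead-letter path.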

@netlify

netlify bot commented May 15, 2023

Deploy Preview for vrl-playground ready!

🔨 Latest commit: e90d2c9
🔍 Latest deploy log: https://app.netlify.com/sites/vrl-playground/deploys/6497409be56d040008db2d2d
😎 Deploy Preview: https://deploy-preview-17395--vrl-playground.netlify.app

@netlify

netlify bot commented May 15, 2023

Deploy Preview for vector-project ready!

🔨 Latest commit: e90d2c9
🔍 Latest deploy log: https://app.netlify.com/sites/vector-project/deploys/6497409b927264000871c3ea
😎 Deploy Preview: https://deploy-preview-17395--vector-project.netlify.app/reports/lighthouse

@github-actions github-actions bot added domain: codecs Anything related to Vector's codecs (encoding/decoding) domain: sinks Anything related to the Vector's sinks labels May 15, 2023
@ktff ktff changed the title from enhancement(aws_s3 sink): Add `parquet codec to enhancement(aws_s3 sink): Add parquet codec May 15, 2023
@github-actions

Regression Detector Results

Run ID: 3ba1be74-2688-4cb6-95f0-9bdc49026042
Baseline: 6c57ca0
Comparison: efaf549
Total vector CPUs: 7

Explanation

A regression test is an integrated performance test for vector in a repeatable rig, with varying configuration for vector. What follows is a statistical summary of a brief vector run for each configuration across the SHAs given above. The goal of these tests is to determine quickly whether vector performance is changed, and to what degree, by a pull request.

Because a target's optimization goal performance in each experiment will vary somewhat each time it is run, we can only estimate mean differences in optimization goal relative to the baseline target. We express these differences as a percentage change relative to the baseline target, denoted "Δ mean %". These estimates are made to a precision that balances accuracy and cost control. We represent this precision as a 90.00% confidence interval denoted "Δ mean % CI": there is a 90.00% chance that the true value of "Δ mean %" is in that interval.

We decide whether a change in performance is a "regression" -- a change worth investigating further -- if both of the following two criteria are true:

  1. The estimated |Δ mean %| ≥ 5.00%. This criterion intends to answer the question "Does the estimated change in mean optimization goal performance have a meaningful impact on your customers?". We assume that when |Δ mean %| < 5.00%, the impact on your customers is not meaningful. We also assume that a performance change in optimization goal is worth investigating whether it is an increase or decrease, so long as the magnitude of the change is sufficiently large.

  2. Zero is not in the 90.00% confidence interval "Δ mean % CI" about "Δ mean %". This statement is equivalent to saying that there is at least a 90.00% chance that the mean difference in optimization goal is not zero. This criterion intends to answer the question, "Is there a statistically significant difference in mean optimization goal performance?". It also means there is no more than a 10.00% chance this criterion reports a statistically significant difference when the true difference in mean optimization goal is zero -- a "false positive". We assume you are willing to accept a 10.00% chance of inaccurately detecting a change in performance when no true difference exists.

The table below, if present, lists those experiments that have experienced a statistically significant change in mean optimization goal performance between baseline and comparison SHAs with 90.00% confidence OR have been detected as newly erratic. Negative values of "Δ mean %" mean that baseline is faster, whereas positive values of "Δ mean %" mean that comparison is faster. Results that do not exhibit more than a ±5.00% change in their mean optimization goal are discarded. An experiment is erratic if its coefficient of variation is greater than 0.1. The abbreviated table will be omitted if no interesting change is observed.
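The two criteria above amount to a simple decision rule. A minimal sketch (the thresholds come from the text above; the function name is illustrative, not part of the regression detector):

```python
# Sketch of the regression decision rule described above:
# flag a result only if |Δ mean %| >= 5.00 AND the 90% CI excludes zero.
def is_regression(delta_mean_pct, ci_low, ci_high, threshold=5.0):
    large_enough = abs(delta_mean_pct) >= threshold
    ci_excludes_zero = not (ci_low <= 0.0 <= ci_high)
    return large_enough and ci_excludes_zero

# e.g. the datadog_agent_remap_blackhole row below: +2.68 [+2.58, +2.78].
# The CI excludes zero, but |Δ mean %| < 5.00, so it is not flagged.
print(is_regression(2.68, 2.58, 2.78))   # False
print(is_regression(-6.1, -7.0, -5.2))   # True: large and significant
print(is_regression(6.0, -0.5, 12.0))    # False: CI contains zero
```

Both conditions must hold, which is why none of the rows in the table below are flagged despite many having 100.00% confidence.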

No interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%.

Fine details of change detection per experiment.
experiment goal Δ mean % Δ mean % CI confidence
datadog_agent_remap_blackhole ingress throughput +2.68 [+2.58, +2.78] 100.00%
splunk_hec_route_s3 ingress throughput +1.40 [+1.26, +1.54] 100.00%
syslog_loki ingress throughput +1.03 [+0.94, +1.12] 100.00%
syslog_log2metric_humio_metrics ingress throughput +0.87 [+0.73, +1.01] 100.00%
otlp_grpc_to_blackhole ingress throughput +0.80 [+0.68, +0.92] 100.00%
syslog_regex_logs2metric_ddmetrics ingress throughput +0.75 [+0.47, +1.03] 99.94%
syslog_humio_logs ingress throughput +0.44 [+0.35, +0.53] 100.00%
syslog_log2metric_splunk_hec_metrics ingress throughput +0.23 [+0.12, +0.34] 99.21%
http_text_to_http_json ingress throughput +0.23 [+0.13, +0.33] 99.68%
file_to_blackhole ingress throughput +0.03 [-0.01, +0.08] 63.94%
enterprise_http_to_http ingress throughput +0.01 [-0.01, +0.04] 46.83%
splunk_hec_indexer_ack_blackhole ingress throughput +0.01 [-0.04, +0.05] 14.13%
splunk_hec_to_splunk_hec_logs_acks ingress throughput +0.00 [-0.06, +0.07] 1.24%
fluent_elasticsearch ingress throughput +0.00 [-0.00, +0.00] 25.73%
http_to_http_noack ingress throughput -0.01 [-0.07, +0.04] 25.24%
splunk_hec_to_splunk_hec_logs_noack ingress throughput -0.02 [-0.06, +0.03] 37.23%
http_to_http_acks ingress throughput -0.11 [-1.34, +1.12] 9.04%
datadog_agent_remap_blackhole_acks ingress throughput -0.16 [-0.25, -0.07] 97.77%
http_to_http_json ingress throughput -0.60 [-0.68, -0.53] 100.00%
syslog_splunk_hec_logs ingress throughput -0.77 [-0.84, -0.70] 100.00%
socket_to_socket_blackhole ingress throughput -1.25 [-1.30, -1.20] 100.00%
otlp_http_to_blackhole ingress throughput -1.31 [-1.45, -1.16] 100.00%
datadog_agent_remap_datadog_logs_acks ingress throughput -1.73 [-1.83, -1.63] 100.00%
datadog_agent_remap_datadog_logs ingress throughput -2.46 [-2.57, -2.35] 100.00%

@ktff
Contributor Author

ktff commented Jun 22, 2023

Not supporting the logical types LIST and MAP from the start could cause significant confusion, so I'll add them.

@jszwedko
Member

jszwedko commented Aug 1, 2023

@ktff are you still working on this one?

@ktff
Contributor Author

ktff commented Aug 2, 2023

@jszwedko I am, but as I understand it, it's blocked on support for batched codecs. Beyond that, this PR is missing documentation and a decision on how to deal with events that don't conform to the schema.

@jszwedko
Member

jszwedko commented Aug 2, 2023

@jszwedko I am, but as I understand it, it's blocked on support for batched codecs. Beyond that, this PR is missing documentation and a decision on how to deal with events that don't conform to the schema.

Ah, gotcha. I'm not sure when we'll get to adding the batched codec support, but if that is something you are interested in we'd be happy to help 🙂 .

@satellite-no

Any movement on this PR?

@pront pront force-pushed the master branch 4 times, most recently from 1720078 to ffe54be Compare July 10, 2025 15:43



Development

Successfully merging this pull request may close these issues.

Support parquet columnar format in the aws_s3 sink

3 participants