
All Questions

0 votes
1 answer
152 views

fluentd output to s3 failed

I was trying to push logs produced by an application running on docker-compose to my s3 bucket using fluentd, but got the following error: fluentd | 2024-07-04 09:01:48 +0000 [error]: #0 /usr/lib/...
Garou
  • 11
0 votes
0 answers
325 views

Remove excess line breaks from s3 log files (fluent-bit s3 output plugin)

I am using fluent-bit s3 output plugin to upload Kubernetes pod logs to s3. I see excessive line breaks in s3 log files as below: 2024-01-24 10:03:34.510 [65b0e07526a14752251fdf7a2e309f58] INFO [Log] ...
Elnur Mammadov
0 votes
1 answer
326 views

How to hide s3 bucket and access key credentials from fluentd config file

We are sending the logs from on-prem servers to the S3 bucket by using Fluentd. The configuration below is from my Fluentd config file. So I am wondering whether there is a way to mask/hide/secure our ...
SriniDK
  • 79
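One common approach, sketched here as an assumption rather than this question's accepted answer, is to keep the secrets in environment variables and reference them with embedded Ruby, so the config file contains no literal keys (AWS_KEY_ID, AWS_SEC_KEY and S3_BUCKET are placeholder variable names):

<match app.**>
  @type s3
  # the "#{ENV[...]}" form is evaluated when fluentd loads the config,
  # so the variables must be exported before td-agent/fluentd starts
  aws_key_id "#{ENV['AWS_KEY_ID']}"
  aws_sec_key "#{ENV['AWS_SEC_KEY']}"
  s3_bucket "#{ENV['S3_BUCKET']}"
  s3_region us-east-1
  path logs/
</match>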
2 votes
0 answers
310 views

fluentd S3 use part of log tag in bucket path

Is there a way to use the first part of the tag in the bucket path while uploading to aws s3 using fluentd? I can use the entire tag like this: <store> @type s3 s3_bucket "#{ENV['S3_BUCKET']}&...
Datrix-A
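For reference, fluentd v1 buffer placeholders can expose individual tag parts in the path once tag is declared as a chunk key; a rough sketch (bucket and region are placeholders):

<match app.**>
  @type s3
  s3_bucket "#{ENV['S3_BUCKET']}"
  s3_region us-east-1
  # ${tag[0]} expands to the first tag part (e.g. "app" for tag app.nginx.access)
  path logs/${tag[0]}/%Y/%m/%d/
  <buffer tag,time>
    timekey 3600
  </buffer>
</match>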
2 votes
1 answer
2k views

FluentBit S3 upload with container name as key in s3

My log file name in s3 looks like kube.var.log.containers.development-api-connect-green-58db8964cb-wrzg5_default_api-connect-fa7cafd99a1bbb8bca002c8ab5e3b2aefc774566bb7e9eb054054112f43f1e87.log/ here ...
Vikas Shaw
0 votes
1 answer
780 views

FluentD uploads new file instead of rotated file to s3 for hourly log rotation

We have a python application that generates an hourly rotating log, and we have set the time for each rotation to the start of each hour, i.e. the log rotation happens at 10:00, 11:00, 12:00 .... The ...
Sameer Ranjha
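Not necessarily the fix accepted for this question, but the in_tail parameters that usually govern rotation handling are rotate_wait and follow_inodes (newer fluentd versions); a minimal sketch with hypothetical paths:

<source>
  @type tail
  path /var/log/app/app.log
  pos_file /var/log/td-agent/app.log.pos
  tag app.hourly
  # keep reading the rotated file for a while instead of switching immediately
  rotate_wait 30
  # track files by inode so a rename-based rotation is not treated as a brand-new file
  follow_inodes true
  <parse>
    @type none
  </parse>
</source>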
0 votes
0 answers
865 views

gsub in FluentD and S3 not substituting unicode characters

Message appears as this string in S3 bucket: '\u00001\u00001\u0000/\u00002\u00003\u0000/\u00002\u00000\u00002\u00001\u0000 \u00001\u00007\u0000:\u00004\u00005\u0000,\u0000s\u0000e\u0000v\u0000e\...
helpaccount321
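The interleaved \u0000 characters typically indicate a UTF-16 source file rather than something gsub can fix after the fact; not confirmed as this question's answer, but in_tail can transcode at read time (the paths and the UTF-16LE guess are assumptions):

<source>
  @type tail
  path /var/log/app/app.log
  pos_file /var/log/td-agent/app.log.pos
  tag app.raw
  # assumption: the file is UTF-16LE; convert to UTF-8 before records reach the S3 output
  from_encoding UTF-16LE
  encoding UTF-8
  <parse>
    @type none
  </parse>
</source>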
1 vote
0 answers
878 views

s3_out: unable to sign request without credentials set

I am trying to use "instance_profile_credentials" on an ec2 instance as credentials. However I get 2021-09-16 14:16:50 +0000 [error]: #0 unexpected error error_class=RuntimeError error="can't ...
carfield
  • 2,081
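For context, out_s3 can take credentials from the instance profile via a dedicated credentials section instead of static keys; a minimal sketch with a placeholder bucket (whether this resolves the error above depends on the instance's metadata and IAM setup):

<match app.**>
  @type s3
  s3_bucket my-log-bucket
  s3_region us-east-1
  path logs/
  # use the EC2 instance profile; no aws_key_id/aws_sec_key needed
  <instance_profile_credentials>
  </instance_profile_credentials>
</match>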
0 votes
1 answer
666 views

How to use fields from record_transformer in the S3 path (fluentd-output-s3 plugin)

We have the below record_transformer config in our fluentd pipeline: <filter docker.**> @type record_transformer enable_ruby true <record> servername as1 hostname "#{Socket....
chitender kumar
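For reference, a field added by record_transformer (such as servername above) can appear in the out_s3 path by declaring it as a buffer chunk key; a sketch with a placeholder bucket:

<match docker.**>
  @type s3
  s3_bucket my-log-bucket
  s3_region us-east-1
  # ${servername} is filled per chunk because servername is listed as a chunk key below
  path logs/${servername}/%Y/%m/%d/
  <buffer time,servername>
    timekey 3600
  </buffer>
</match>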
0 votes
1 answer
748 views

Fluentd - S3 output plugin - Impossible to use environment variable as part of S3 path

Here is my use case: We have hundreds of kubernetes pods that generate logs and send them to S3. For performance issues with the current solution, we are trying to implement fluentd as sidecars to ...
Thierry Montalbano
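Worth noting as an assumption about this setup rather than its accepted answer: "#{ENV['...']}" is interpolated once when fluentd parses the config, so it works for per-pod values injected into the sidecar's environment before the process starts (POD_NAME here is a hypothetical variable):

<match app.**>
  @type s3
  s3_bucket my-log-bucket
  s3_region us-east-1
  # POD_NAME must exist in the container environment when fluentd starts
  path "logs/#{ENV['POD_NAME']}/%Y/%m/%d/"
  <buffer time>
    timekey 3600
  </buffer>
</match>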
0 votes
1 answer
1k views

How to remove unicode in fluentd tail/s3 plugin

I have a fluentd configuration with a tail source and aws s3 as the destination. I am able to store the logs in S3. We have enabled coloring in the application logs based on log levels in winston ...
Prakash26790
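The stray characters from winston's colorized output are ANSI escape sequences; one common approach (an assumption, not necessarily this question's accepted answer) is a record_transformer filter that strips them before the S3 output:

<filter app.**>
  @type record_transformer
  enable_ruby
  <record>
    # remove ANSI color sequences such as \e[32m ... \e[0m from the message field
    message ${record["message"].to_s.gsub(/\e\[[0-9;]*m/, "")}
  </record>
</filter>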
15 votes
3 answers
25k views

Loki config with s3

I can't get Loki to connect to AWS S3 using docker-compose. Logs are visible in Grafana but the S3 bucket remains empty. The s3 bucket is public and I have an IAM role attached to allow s3:FullAccess. ...
markhorrocks
  • 1,358
2 votes
2 answers
3k views

how to add multiple outputs in fluentd-kubernetes-daemonset in kubernetes

I'm using the fluentd daemonset docker image, and sending logs to ES with fluentd works perfectly using the following code snippet: containers: - name: fluentd image: ...
Pyae Phyoe Shein
0 votes
1 answer
1k views

How can I read logs from a AWS SQS queue into FluentD? [closed]

I'm trying to read some logs in my AWS SQS queue into fluentd. I thought the fluent-plugin-s3 plugin takes care of this but after reading the documentation it seems that it only writes to an S3 bucket....
Chesneycar
0 votes
1 answer
1k views

Logs shipped with wrong timestamp and timekey ignored

I want to ship my Vault logs to s3. Based on this issue I did this: ## vault input <source> @type tail path /var/log/vault_audit.log pos_file /var/log/td-agent/vault.audit_log.pos <...
Juicy
  • 12.5k
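For reference, timekey grouping follows the event time that the parser extracts, so the tail source usually needs an explicit time_key; a sketch reusing the paths from the excerpt, with the time field name and format as assumptions about Vault's audit log:

<source>
  @type tail
  path /var/log/vault_audit.log
  pos_file /var/log/td-agent/vault.audit_log.pos
  tag vault.audit
  <parse>
    @type json
    # assumption: the audit record carries its own timestamp in a "time" field
    time_key time
    time_type string
    time_format %Y-%m-%dT%H:%M:%S.%N%z
  </parse>
</source>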
6 votes
1 answer
14k views

How to send logs to multiple outputs with same match tags in Fluentd?

I have a Fluentd instance, and I need it to send my logs matching the fv-back-* tags to Elasticsearch and Amazon S3. Is there a way to configure Fluentd to send data to both of these outputs? Right ...
Pedro Henrique
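For reference, the usual pattern for this is the copy output with one <store> per destination; a rough sketch in which the Elasticsearch host and the bucket are placeholders:

<match fv-back-*>
  @type copy
  <store>
    @type elasticsearch
    # placeholder host and port
    host elasticsearch.example.internal
    port 9200
    logstash_format true
  </store>
  <store>
    @type s3
    # placeholder bucket and region
    s3_bucket my-log-bucket
    s3_region us-east-1
    path logs/
    <buffer time>
      timekey 3600
    </buffer>
  </store>
</match>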
1 vote
0 answers
494 views

Is it possible to set host in fluentd s3_output

Is it possible to somehow set %{host} (not %{hostname}: that points to the local fluentd server) in the fluentd S3 path, like: s3://logs/2018/07/10/web01/misc-20180710-07_1.gz? host is one of the message ...
Andrii Petrenko
1 vote
1 answer
2k views

How to add portion of timestamp to Fluentd output file format

How can I add a certain portion of the timestamp to the output file generated by fluentd? In the config file: s3_object_key_format generic-logs/%{time_slice}/out-%{H}-%{M}-%{S}-%{index}.log time_slice_format %...
moooni moon
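For context, s3_object_key_format only understands out_s3's own %{...} placeholders, so hour and minute normally go through time_slice_format and show up via %{time_slice}; a sketch with a placeholder bucket, not necessarily this question's accepted answer:

<match app.**>
  @type s3
  s3_bucket my-log-bucket
  s3_region us-east-1
  # %{time_slice} is rendered with time_slice_format, so the hour/minute belong there
  s3_object_key_format generic-logs/%{time_slice}_%{index}.log
  time_slice_format %Y%m%d/out-%H-%M
  <buffer time>
    timekey 60
  </buffer>
</match>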
1 vote
0 answers
1k views

Is it possible to run fluentd as an AWS Lambda?

I have logs in a data lake. I want to parse and push those logs to AWS S3 using the fluent-s3 output plugin. Here I want to run fluentd as an AWS Lambda which will be triggered and push the data lake files into ...
KISHORE
  • 21
0 votes
1 answer
2k views

Fluentd not writing logs into amazon s3

I am trying to test the fluentd-s3-plugin, yet at the moment it is not posting my logs into the s3 bucket. I am running everything on ubuntu xenial, having installed fluentd with td-agent. The following is ...
francotestori
0 votes
1 answer
212 views

How to upload 1 hour old files to S3 using fluentd?

I have successfully uploaded files to S3 using fluentd. I am now trying to upload only one-hour-old files to S3, e.g. fluentd should only upload files which are 1 hour old and should not upload files ...
abc
  • 1
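For reference, "only upload data that is an hour old" is normally expressed through the buffer's timekey rather than file age; a minimal sketch (placeholder bucket) that uploads one object per closed hour:

<match app.**>
  @type s3
  s3_bucket my-log-bucket
  s3_region us-east-1
  path logs/%Y/%m/%d/
  <buffer time>
    # group records into hourly chunks and wait briefly for late events before uploading
    timekey 3600
    timekey_wait 10m
  </buffer>
</match>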
3 votes
0 answers
2k views

fluentd s3 output plugin configuration

I have been trying to get out_s3 for fluentd working for the past 2 days, but I am unable to see the logs in my s3. This is my current config: <match web.all> type s3 aws_key_id .........
pixelscreen
  • 1,975
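The excerpt uses the old type s3 form; for comparison, a minimal v1-style out_s3 block looks roughly like this (keys, bucket, region and paths are placeholders, and the short timekey is only so uploads appear quickly while testing):

<match web.all>
  @type s3
  # placeholder credentials, bucket and region
  aws_key_id YOUR_KEY_ID
  aws_sec_key YOUR_SECRET_KEY
  s3_bucket my-log-bucket
  s3_region us-east-1
  path logs/web/
  <buffer time>
    @type file
    path /var/log/td-agent/buffer/s3
    timekey 60
    timekey_wait 10s
  </buffer>
</match>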
2 votes
1 answer
3k views

Fluentd copy output plugin not sending same data to both destinations?

I have a web server stack with multiple nodes (auto scaling group), and each web server is configured to use Fluentd to forward log files to a central collector which saves the logs in an S3 bucket. ...
Petar Zivkovic
1 vote
2 answers
2k views

td-agent is not working after update

Td-agent is not working after a yum update on Amazon Linux. Td-agent 1.1.20-0 worked, but 1.1.21-0 is not working and there is no log data in s3. The error message is 2015-01-15 06:36:40 +0900 [error]: failed to ...
richasonson
1 vote
0 answers
640 views

Fluentd can't upload buffer file to s3 every hour on the hour

Hi everyone: I use fluentd to collect my logs, and these logs are generated by my python program, so I use forward as the source parameter. But the file that is generated every hour on the hour will be ...
magigo
  • 176