I have some problems parsing JSON logs received from a Docker container. I know this question is probably a duplicate, but none of the solutions I found, including the documentation (https://docs.fluentd.org/filter/parser), helped.
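
The logs reach Fluentd through Docker's fluentd logging driver; the container is started roughly like this (the image name is just a placeholder, and the tag matches the <match> block below):

  docker run --log-driver=fluentd \
    --log-opt fluentd-address=localhost:24224 \
    --log-opt tag=mylog \
    my-app-image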

My fluentd.conf currently looks like this:

  <source>
    @type forward
    port 24224
    bind "0.0.0.0"
  </source>

  <match mylog>
    @type opensearch
    @log_level "info"
    host "opensearch-node1"
    port 9200
    logstash_format true
    logstash_dateformat "%Y%m%d"
    logstash_prefix "mylog"
  </match>

  <filter **>
    @type parser
    key_name "log"
    hash_value_field "log"
    reserve_data true
    <parse>
      @type "json"
    </parse>
  </filter>

But it seems the filter is not applied, because the log field does not change.

Result in OpenSearch:


{
  "_index": "mylog-20221024",
  "_id": "93cECoQBW3_q0OULbk2_",
  "_version": 1,
  "_score": null,
  "_source": {
    "container_id": "47de762b4d113478ee125417abd9e6ec7d963aa0d77758fc92cf3a18b0d2ad86",
    "container_name": "/mylog",
    "source": "stdout",
    "log": "{\"time\":\"2022-10-24T15:42:48.480757224+03:00\",\"id\":\"\",\"remote_ip\":\"10.73.133.144\",\"host\":\"localhost:80\",\"method\":\"GET\",\"uri\":\"/metrics\"}",
    "@timestamp": "2022-10-24T12:42:48.000000000+00:00"
  },
  "fields": {
    "@timestamp": [
      "2022-10-24T12:42:48.000Z"
    ]
  },
  "sort": [
    1666615368000
  ]
}

Expected:


{
  "_index": "mylog-20221024",
  "_id": "93cECoQBW3_q0OULbk2_",
  "_version": 1,
  "_score": null,
  "_source": {
    "container_id": "47de762b4d113478ee125417abd9e6ec7d963aa0d77758fc92cf3a18b0d2ad86",
    "container_name": "/mylog",
    "source": "stdout",
    "log": "{\"time\":\"2022-10-24T15:42:48.480757224+03:00\",\"id\":\"\",\"remote_ip\":\"10.73.133.144\",\"host\":\"localhost:80\",\"method\":\"GET\",\"uri\":\"/metrics\"}",
    "time": "2022-10-24T15:42:48.480757224+03:00",
    "id":"",
    "remote_ip":"10.73.133.144",
    "host":"localhost:80",
    "method":"GET",
    "uri":"/metrics",
    "@timestamp": "2022-10-24T12:42:48.000000000+00:00",
  },
  "fields": {
    "@timestamp": [
      "2022-10-24T12:42:48.000Z"
    ]
  },
  "sort": [
    1666615368000
  ]
}

Thanks!

  • From your expected output, it looks like you're trying to enhance the log by adding more fields to it. You might want to look at docs.fluentd.org/filter/record_transformer or github.com/repeatedly/fluent-plugin-record-modifier.
    – Azeem
    Commented Oct 25, 2022 at 6:59
  • Thanks! I'm trying to extract the inner JSON fields from "log". docs.fluentd.org/filter/record_transformer looks like it could help, but I don't quite understand how to do what I need.
    – nikhrom
    Commented Oct 25, 2022 at 7:41
  • Sure, no problem. Yes, the record_transformer should do what you want. Go through its documentation, examples, and other relevant SO Q&A. Hopefully, you'll get it to work as expected. Good luck!
    – Azeem
    Commented Oct 25, 2022 at 8:31
  • Unfortunately, it looks like I can only extract fields that are known in advance, while I need to extract ALL fields from "log".
    – nikhrom
    Commented Oct 25, 2022 at 8:41
  • The value of the log key is a string literal. You need to convert that to JSON, and then you can access the nested fields and create new ones from them. Please take a look at the previous Q&A tagged fluentd. IIRC, there have been similar questions and you can leverage the answers and discussions from those threads.
    – Azeem
    Commented Oct 25, 2022 at 10:10

1 Answer

The solution to the problem turned out to be very simple: the <filter> has to be placed before the <match>, since Fluentd processes directives in order and the <match> was consuming the events before the filter could run. My fluentd.conf now looks like this:

  <source>
    @type forward
    port 24224
    bind "0.0.0.0"
  </source>

  <filter mylog**>
    @type parser
    format json
    key_name log
    reserve_data true
    reserve_time true
    inject_key_prefix logs.
  </filter>

  <match mylog>
    @type opensearch
    @log_level "info"
    host "opensearch-node1"
    port 9200
    logstash_format true
    logstash_dateformat "%Y%m%d"
    logstash_prefix "mylog"
  </match>
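
With inject_key_prefix logs. in place, the parsed fields come out prefixed, so the _source of the indexed document should look roughly like this (values taken from the sample log above):

{
  "container_id": "47de762b4d113478ee125417abd9e6ec7d963aa0d77758fc92cf3a18b0d2ad86",
  "container_name": "/mylog",
  "source": "stdout",
  "log": "{\"time\":\"2022-10-24T15:42:48.480757224+03:00\",\"id\":\"\",\"remote_ip\":\"10.73.133.144\",\"host\":\"localhost:80\",\"method\":\"GET\",\"uri\":\"/metrics\"}",
  "logs.time": "2022-10-24T15:42:48.480757224+03:00",
  "logs.id": "",
  "logs.remote_ip": "10.73.133.144",
  "logs.host": "localhost:80",
  "logs.method": "GET",
  "logs.uri": "/metrics",
  "@timestamp": "2022-10-24T12:42:48.000000000+00:00"
}

If you want the fields without the prefix, as in the expected output in the question, just drop the inject_key_prefix line.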
