[SOLVED] MongoDB: Slow query

I have…

I’m submitting a…

  • [ ] Regression (a behavior that stopped working in a new release)
  • [ ] Bug report
  • [x] Performance issue
  • [ ] Documentation issue or request

Current behavior

I checked the logs recently and noticed (I finally set up shipping the MongoDB logs to Seq) that there are a lot of “Slow query” entries in MongoDB. I am not sure whether this is related to any change, as I recently updated Docker to the latest version and unfortunately cannot check whether it was the same before.

Raw log entry:

{
  "command": {
    "$db": "Squidex",
    "filter": {
      "EventStream": {
        "$regularExpression": {
          "options": "",
          "pattern": "^app-|^app-"
        }
      },
      "Timestamp": {
        "$gt": {
          "$timestamp": {
            "i": 1,
            "t": 1664745154
          }
        }
      }
    },
    "find": "Events2",
    "limit": 2147483647,
    "lsid": {
      "id": {
        "$uuid": "982c73ba-cd2d-4216-8f58-831afbb6aa87"
      }
    },
    "sort": {
      "EventStream": -1,
      "Timestamp": 1
    }
  },
  "cursorExhausted": true,
  "docsExamined": 0,
  "durationMillis": 1384,
  "fromMultiPlanner": true,
  "keysExamined": 154022,
  "locks": {
    "FeatureCompatibilityVersion": {
      "acquireCount": {
        "r": 925
      }
    },
    "Global": {
      "acquireCount": {
        "r": 925
      }
    },
    "Mutex": {
      "acquireCount": {
        "r": 1
      }
    }
  }
}

It’s almost exclusively "^app-|^app-"; the only other patterns I found were '^schema-', '^assetFolder-', and '^asset-', but those appeared only a few times and may be related to something else (GC?). "^app-|^app-" happens all the time.
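As an aside, the two alternation branches in that pattern are identical, so it accepts exactly the same strings as the single prefix `^app-`. A quick illustration with Python's `re` module (sample strings are made up, not taken from the real event streams):

```python
import re

# The duplicated alternation "^app-|^app-" matches exactly the same
# strings as the single prefix pattern "^app-".
duplicated = re.compile(r"^app-|^app-")
single = re.compile(r"^app-")

samples = ["app-123", "schema-1", "assetFolder-2", "app-", "xapp-1"]
assert [bool(duplicated.match(s)) for s in samples] == \
       [bool(single.match(s)) for s in samples]
print("both patterns agree on all samples")
```

The semantic equivalence is what makes the slowness surprising: the alternation form changes nothing about which events match, only how the pattern is evaluated.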

Expected behavior

No slow queries, or should this be something Squidex logs? Or can I consider it normal, since I don't think I have noticed anything not working?

Minimal reproduction of the problem

It is tricky, as this is a log from MongoDB, so I do not have the exact context from Squidex.

Environment

  • [x] Self hosted with docker
  • [ ] Self hosted with IIS
  • [ ] Self hosted with other version
  • [ ] Cloud version

Version: 7.8.2

Most of the time here is spent on locks, so I would not be worried. But I actually have an idea how to improve that. I don't know why it took so many years; perhaps I have already tried it and it didn't work.

Hi,

it is actually a bug. It turns out that with a regex like ^app-, MongoDB can use the index B-tree to satisfy the query (which makes sense), but with ^app-|^app- it has to evaluate the regex against every single indexed key, which is much slower. I have made a general improvement to the event store query system.
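The difference can be sketched outside MongoDB: a simple anchored prefix lets the planner convert the regex into index bounds (a range scan over the sorted B-tree keys), while an alternation gives it no single prefix bound, so the regex must be tested against every indexed key. A minimal Python simulation with made-up keys (this models the idea, not MongoDB's actual planner):

```python
import bisect
import re

# Sorted "index keys", standing in for the EventStream index B-tree.
keys = sorted([f"app-{i:04d}" for i in range(1000)] +
              [f"schema-{i:04d}" for i in range(1000)])

def range_scan(keys, prefix):
    """Anchored prefix -> index bounds: only keys in [prefix, prefix+'\xff') are touched."""
    lo = bisect.bisect_left(keys, prefix)
    hi = bisect.bisect_left(keys, prefix + "\xff")
    return keys[lo:hi], hi - lo          # (matches, keys examined)

def full_scan(keys, pattern):
    """Alternation -> no usable bounds: the regex runs on every indexed key."""
    rx = re.compile(pattern)
    return [k for k in keys if rx.match(k)], len(keys)

fast, fast_examined = range_scan(keys, "app-")
slow, slow_examined = full_scan(keys, "^app-|^app-")

assert fast == slow                      # same result set...
assert fast_examined < slow_examined     # ...but far fewer keys examined
print(f"range scan examined {fast_examined}, full scan examined {slow_examined}")
```

This mirrors the log above, where keysExamined is 154022 even though docsExamined is 0: every key is inspected just to reject it.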

I have noticed some other slow queries in the log, but they may not be related to the same problem (they most likely occur while batch-loading all assets). Should I post them here, or wait for more “evidence” and open a new thread then?

Yes, why not. But not all slow queries are general performance problems; sometimes it is just the hardware.

Yes, I am aware of that, and most likely the other queries are:

#1

{
  "command": {
    "$db": "SquidexContent",
    "aggregate": "States_Contents_Published3",
    "cursor": {},
    "lsid": {
      "id": {
        "$uuid": "7e1f2c1d-fe1b-4974-a9cb-c87dbd450461"
      }
    },
    "pipeline": [
      {
        "$match": {
          "_ai": "97432068-10f9-4b98-81ba-ef93a96cc466",
          "_si": {
            "$in": [
              "5769cadb-3be6-4a79-bb39-1a9d23cb418c"
            ]
          },
          "dl": {
            "$ne": true
          },
          "id": {
            "$gt": "00000000-0000-0000-0000-000000000000"
          },
          "mt": {
            "$gt": {
              "$date": "1970-01-01T00:00:00.000Z"
            }
          }
        }
      },
      {
        "$group": {
          "_id": 1,
          "n": {
            "$sum": 1
          }
        }
      }
    ]
  },
  "cursorExhausted": true,
  "docsExamined": 16168,
  "durationMillis": 115,
  "keysExamined": 16169,
  "locks": {
    "FeatureCompatibilityVersion": {
      "acquireCount": {
        "r": 19
      }
    },
    "Global": {
      "acquireCount": {
        "r": 19
      }
    },
    "Mutex": {
      "acquireCount": {
        "r": 2
      }
    }
  },
  "nreturned": 1,
  "ns": "SquidexContent.States_Contents_Published3",
  "numYields": 17,
  "planCacheKey": "4FE64636",
  "planSummary": "IXSCAN { _si: 1, dl: 1, mt: -1 }",
  "protocol": "op_msg",
  "queryFramework": "classic",
  "queryHash": "EDBA221A",
  "remote": "10.0.28.4:39248",
  "reslen": 154,
  "storage": {},
  "type": "command"
}

#2

{
  "command": {
    "$db": "Squidex",
    "findAndModify": "Queue_rules.run",
    "lsid": {
      "id": {
        "$uuid": "91764684-8c7e-4f66-a604-7993840bef8e"
      }
    },
    "query": {
      "QueueName": {
        "$ne": null
      },
      "TimeHandled": null,
      "TimeToLive": {
        "$gt": {
          "$date": "2023-10-10T17:42:49.799Z"
        }
      }
    },
    "update": {
      "$set": {
        "TimeHandled": {
          "$date": "2023-10-10T17:42:49.799Z"
        }
      }
    }
  },
  "docsExamined": 0,
  "durationMillis": 118,
  "flowControl": {
    "acquireCount": 1,
    "timeAcquiringMicros": 2
  },
  "keysExamined": 0,
  "locks": {
    "Collection": {
      "acquireCount": {
        "w": 1
      }
    },
    "Database": {
      "acquireCount": {
        "w": 1
      }
    },
    "FeatureCompatibilityVersion": {
      "acquireCount": {
        "w": 1
      }
    },
    "Global": {
      "acquireCount": {
        "w": 1
      }
    },
    "Mutex": {
      "acquireCount": {
        "r": 1
      }
    },
    "ParallelBatchWriterMode": {
      "acquireCount": {
        "r": 1
      }
    },
    "ReplicationStateTransition": {
      "acquireCount": {
        "w": 1
      }
    }
  },
  "nMatched": 0,
  "nModified": 0,
  "nUpserted": 0,
  "ns": "Squidex.Queue_rules.run",
  "numYields": 0,
  "planCacheKey": "25A7A67C",
  "planSummary": "IXSCAN { TimeHandled: 1, QueueName: 1 }",
  "protocol": "op_msg",
  "queryHash": "5A05C331",
  "remote": "10.0.28.4:33172",
  "reslen": 92,
  "storage": {},
  "type": "command"
}

SquidexContent.States_Contents_Published3 appears in the log a lot as well, but at the same time that may be related to a HW/server issue.


Has this already been resolved? If so, I would be happy to test it when you release a new version :slight_smile: as we currently still have lots of these messages in the log.
