Ola Hallengren – Question about maintenance performance of SQL Server indexes and statistics

I tried to use the IndexOptimize stored procedure on a database with a very large number of indexes (> 250,000). The initial stage, where the stored procedure collects the data that needs to be processed, takes hours, even if I set the @Indexes parameter to limit the job.

SQL Server Maintenance Solution version: 2019-02-10 10:40:47
SQL Server 2017 Standard Edition with the latest CU (CU14) installed.

At a client, I saw a database containing more than 500,000 indexes. After 12 hours, the data collection stage was still running.
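A count along these lines (a rough sketch against the standard catalog views, run in the database itself, excluding heaps) is what I mean by "number of indexes":

SELECT COUNT(*) AS IndexCount
FROM sys.indexes AS i
INNER JOIN sys.objects AS o ON o.[object_id] = i.[object_id]
WHERE o.is_ms_shipped = 0   -- user objects only
  AND i.[type] > 0;         -- exclude heaps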

I would expect that if I set @Indexes to a single index, the run would start almost immediately.
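For example, a call that targets one specific index, something like the sketch below (placeholder names, assuming the documented {database}.{schema}.{table}.{index} format for @Indexes), is what I would expect to skip most of the collection work:

EXECUTE dbo.IndexOptimize
    @Databases = 'db',
    @Indexes = 'db.dbo.MyTable.IX_MyTable_MyColumn',  -- placeholder table and index names
    @LogToTable = 'Y'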

Here is an example of my stored procedure call.

EXECUTE dbo.IndexOptimize
    @Databases = 'db',
    @Indexes = 'db.dbo.',
    @FragmentationLow = NULL,
    @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationLevel1 = 30,
    @FragmentationLevel2 = 50,
    @UpdateStatistics = 'ALL',
    @OnlyModifiedStatistics = 'Y',
    @LogToTable = 'Y',
    @LockTimeout = 60
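
With @LogToTable = 'Y', the only visibility I have during the collection stage is that nothing has been written to the log table yet; a quick check like this (assuming the default dbo.CommandLog table in the database where the solution is installed) shows the most recent commands, if any:

SELECT TOP (50) ID, StartTime, EndTime, DatabaseName, ObjectName, IndexName, Command
FROM dbo.CommandLog
ORDER BY ID DESC;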

Can anyone share their experience using IndexOptimize on a database containing a very large number of indexes?