androids wrote: The solution is plain simple : continue indexing in the background but allow to browse the whole structure even if background indexer has not finished its work ...

One wrinkle I see with this: say Live lets you add something to the project that you browsed to outside of the database while it continues to index in the background. How does Live then change those references to point at what is stored in the database once indexing is complete?
If Ableton stops at that bugfix:
"Scanning folders containing a huge amount of files could take longer than necessary"
then they are just missing the important point.
What if they could get the indexing down to a few minutes or less, instead of many, many minutes? Would that be acceptable?
There are best practices for inserting thousands and thousands of rows into a database, and many of them depend on the specifics: how many indexes you have, and whether adding a record updates those indexes at the same time or the data is inserted first and THEN reindexed. What do you do with record updates? How about reads? Are the queries optimized? Do they read with no lock (dirty reads), or do they lock while reading, so the table can't be reindexed because another process holds the lock?
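As a minimal sketch of the insert-then-reindex idea above: loading all rows in a single transaction and only building the index after the bulk load means each insert skips the per-row index-maintenance cost. This uses SQLite purely for illustration; the table and column names are hypothetical, not Live's actual schema.

```python
import sqlite3

# Hypothetical file records, standing in for entries a browser indexer
# might collect while scanning sample folders.
rows = [(f"/samples/kick_{i}.wav", i % 7) for i in range(50_000)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (path TEXT, folder_id INTEGER)")

# One transaction for the whole bulk load: no per-row commit overhead,
# and no index exists yet, so inserts don't pay index-update costs.
with conn:
    conn.executemany("INSERT INTO files VALUES (?, ?)", rows)

# Build the index once, after all the data is in place.
conn.execute("CREATE INDEX idx_files_folder ON files(folder_id)")

count = conn.execute("SELECT COUNT(*) FROM files").fetchone()[0]
print(count)  # 50000
```

The same trade-off applies to the locking question: a write-ahead-log mode (e.g. SQLite's WAL) lets readers browse the table while the indexer is still writing, which is roughly the "browse while indexing continues" behaviour being asked for.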
There are sooooo many scenarios. The important thing is that there are TONS of resources available for database optimization. It's just data. Lots of people make a living off tuning databases and designing better data models. Ableton can get, and I'm sure IS getting, the resources they need to make this work.
That said, has anyone tested 9.0.3b1 to see the difference?