12-11-2021, 06:12 AM   #203
capink
Wizard
Posts: 1,201
Karma: 1995558
Join Date: Aug 2015
Device: Kindle
Quote:
Originally Posted by chaley
One problem is that the work to measure the time could easily exceed the time to evaluate the template. I think I would need to add profiling to the formatter itself to avoid this.

Another problem is that the template time depends on the metadata, especially if there are conditionals. For example, the time to split and join tags depends on the number of tags. This argues for some ability to average multiple invocations of the template (a timing sketch along these lines follows the quote).

Finally, the execution context matters. By this I mean whether the template is being used in search/sort and, for display purposes, whether the column value has been cached. In the latter case the formatter doesn't even know it was used.

My feeling: if templates are complex enough to need this level of profiling, then one should consider implementing them as custom template functions (a sketch of one follows below). Those are compiled Python and run at full Python speed. You can also avoid extra split/join operations because you know how you will use the information.
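
A minimal sketch of the "average multiple invocations" idea, in Python. The evaluate_template callable and the books iterable are hypothetical stand-ins for however the template actually gets rendered; only the averaging logic is the point here.
Code:
import time

def average_template_time(evaluate_template, books, runs=20):
    # evaluate_template: hypothetical callable that renders the template
    # for one book's Metadata object and returns the result string.
    # books: iterable of Metadata objects, so the average also smooths
    # out per-book differences such as the number of tags.
    total = 0.0
    count = 0
    for mi in books:
        start = time.perf_counter()
        for _ in range(runs):
            evaluate_template(mi)
        total += time.perf_counter() - start
        count += runs
    return total / count if count else 0.0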
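
And a sketch of the "move it into a custom template function" suggestion. This assumes the signature calibre expects for functions added under Preferences → Advanced → Template functions; the function name and the exact tag handling are illustrative only.
Code:
def evaluate(self, formatter, kwargs, mi, locals):
    # mi is the book's Metadata object; mi.tags is already a Python list
    # of tag strings, so no template-level split/join passes are needed.
    tags = mi.tags or []
    return ', '.join(sorted(tags, key=str.lower))
If the function is registered as, say, sorted_tags with an argument count of 0, a template can call it in General Program Mode as program: sorted_tags(); the name is whatever you choose in the dialog.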
Thanks for the informative and comprehensive reply.