Existing methods of exploring a search space are primarily limited to a form of iterative compilation that requires enumerating the entire space, and in many cases the optimization space is far too large to search exhaustively. With an autotuning framework, the space of optimizations can be explored automatically and intelligently, without any need for user interaction.
Many applications cannot be rewritten, yet their maintainers are still expected to improve performance. Directive-based languages let a developer annotate an existing program to guide the compiler: applying specific optimizations, auto-parallelizing code regions, and even automatically targeting accelerators (e.g., NVIDIA GPUs and Intel coprocessors).
Directive-based languages, however, do not adapt easily to different programming models without resorting to (ab)use of the preprocessor. Programming models such as RAJA instead place that burden on efficient tag dispatching and SFINAE in modern C++. I leverage RAJA as a tuning framework, which enables autotuning for directive-based languages by changing a single C++ type.