Subtopic Deep Dive
Derivative-Free Optimization
Research Guide
What is Derivative-Free Optimization?
Derivative-Free Optimization (DFO) develops model-based and direct-search algorithms that minimize black-box functions using only objective evaluations, with no derivative information.
DFO employs interpolation models, trust-region frameworks, and radial basis functions for bound-constrained and noisy problems. Rios and Sahinidis (2012) review algorithms and software in a survey cited more than 1,200 times. Cartis and Roberts (2022) introduce scalable subspace methods for nonlinear least-squares (38 citations).
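The core idea — descending using only objective values — can be illustrated with a minimal compass (coordinate) search. This is a toy sketch of the direct-search family, not any specific algorithm from the papers above; the function and parameters are illustrative.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    """Minimal direct-search DFO loop: poll +/- each coordinate direction,
    move to any improving point, and shrink the step when nothing improves."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(x.size):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = f(trial)          # only objective values, no gradients
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5                # contract when no direction improves
    return x, fx

# toy smooth objective with minimizer at (1, 1)
x_opt, f_opt = compass_search(lambda z: np.sum((z - 1.0) ** 2), [5.0, -3.0])
```

Model-based DFO methods replace this blind polling with an interpolation surrogate, but the contract is the same: progress must come from function evaluations alone.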
Why It Matters
DFO enables optimization of expensive simulations in engineering where gradients are unavailable, such as biomedical imaging (Oeuvray, 2005; 34 citations) and cerebral palsy gait modeling (Hatz, 2014; 25 citations). Rios and Sahinidis (2012) compare software for bound-constrained problems, applied in tuning solvers such as BARON (Liu et al., 2018; 31 citations). Ploskas and Sahinidis (2021) extend the comparison to mixed-integer cases (30 citations), benefiting industrial design problems where derivatives are unreliable.
Key Research Challenges
Handling Noisy Evaluations
Noise in black-box functions degrades model accuracy in trust-region methods. Oeuvray and Bierlaire (2009) address this with radial basis functions (41 citations). Robust sampling strategies are still needed for high-dimensional cases.
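One simple sampling strategy is replication: averaging k repeated evaluations shrinks the noise standard deviation by 1/sqrt(k), at k times the evaluation cost. The sketch below uses a hypothetical noisy objective to demonstrate the effect; it is not drawn from the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, sigma=0.1):
    """Toy black-box objective with additive Gaussian evaluation noise."""
    return float(np.sum(x ** 2)) + rng.normal(0.0, sigma)

def averaged_eval(f, x, k):
    """Average k replicated evaluations; noise std shrinks by 1/sqrt(k)."""
    return sum(f(x) for _ in range(k)) / k

x = np.array([1.0, 2.0])
single = [noisy_f(x) for _ in range(1000)]
batched = [averaged_eval(noisy_f, x, 25) for _ in range(1000)]
# the batched estimates scatter roughly sqrt(25) = 5x less than single evaluations
```

The trade-off is central to noisy DFO: replication is reliable but expensive, which is why regression-based and robust trust-region models are an active research direction.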
Scalability to High Dimensions
Interpolation models suffer from the curse of dimensionality beyond roughly 10 variables. Cartis and Roberts (2022) propose random subspace minimization for large-scale least-squares (38 citations). Probabilistic complexity bounds require further tightening.
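The subspace idea can be sketched in a few lines: at each iteration, draw a random low-dimensional subspace and search only inside it, so the per-iteration cost is decoupled from the ambient dimension. This toy version uses best-of-k random sampling in the subspace and is only an illustration of the principle, not the model-based algorithm of Cartis and Roberts (2022).

```python
import numpy as np

rng = np.random.default_rng(1)

def random_subspace_step(f, x, dim_sub=2, step=0.5, n_candidates=50):
    """One iteration of a toy random-subspace scheme: draw a random
    orthonormal basis for a dim_sub-dimensional subspace, sample candidate
    steps inside it, and keep the best."""
    n = x.size
    Q, _ = np.linalg.qr(rng.normal(size=(n, dim_sub)))  # orthonormal basis
    best_x, best_f = x, f(x)
    for _ in range(n_candidates):
        s = Q @ rng.normal(scale=step, size=dim_sub)    # step lives in subspace
        fs = f(x + s)
        if fs < best_f:
            best_x, best_f = x + s, fs
    return best_x, best_f

f = lambda z: np.sum(z ** 2)   # 20-dimensional toy objective
x = np.ones(20)
for _ in range(200):
    x, fx = random_subspace_step(f, x)
```

Even though each iteration sees only a 2-dimensional slice of the 20-dimensional space, the objective decreases steadily, which is the intuition behind the probabilistic complexity bounds in the subspace literature.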
Mixed-Integer Extensions
Combining continuous DFO with discrete decisions lacks efficient hybrids. Ploskas and Sahinidis (2021) review algorithms for bound-constrained mixed-integer DFO (30 citations). Global optimality guarantees are computationally prohibitive.
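A minimal hybrid treats the two variable types differently during polling: continuous coordinates use a contracting step, while integer coordinates poll their +/-1 neighbors at every iteration. This is a sketch of the idea only, not any of the methods reviewed by Ploskas and Sahinidis (2021); the objective and parameters are illustrative.

```python
import numpy as np

def mixed_integer_search(f, x0, int_mask, step=1.0, tol=1e-4, max_iter=5000):
    """Toy mixed-integer direct search: continuous coordinates poll +/-step
    (step contracts on failure), integer coordinates poll +/-1 only."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    while step >= tol and max_iter > 0:
        improved = False
        for i in range(x.size):
            delta = 1.0 if int_mask[i] else step
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5        # only the continuous step contracts
        max_iter -= 1
    return x, fx

# x[0] is integer, x[1] is continuous; minimizer at (3, 0.5)
f = lambda z: (z[0] - 3) ** 2 + (z[1] - 0.5) ** 2
x_opt, f_opt = mixed_integer_search(f, [0.0, 0.0], int_mask=[True, False])
```

Even this toy hybrid shows why the problem is hard: the integer neighborhood never shrinks, so convergence certificates for the continuous part do not carry over to the discrete part.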
Essential Papers
Derivative-free optimization: a review of algorithms and comparison of software implementations
Luis Miguel Rios, Nikolaos V. Sahinidis · 2012 · Journal of Global Optimization · 1.2K citations
Abstract This paper addresses the solution of bound-constrained optimization problems using algorithms that require only the availability of objective function values but no derivative information....
BOOSTERS: A Derivative-Free Algorithm Based on Radial Basis Functions
Rodrigue Oeuvray, Michel Bierlaire · 2009 · International Journal of Modelling and Simulation · 41 citations
Derivative-free optimization involves the methods used to minimize an expensive objective function when its derivatives are not available. We present here a trust-region algorithm based on Radial Bas...
Scalable subspace methods for derivative-free nonlinear least-squares optimization
Coralia Cartis, Lindon Roberts · 2022 · Mathematical Programming · 38 citations
Abstract We introduce a general framework for large-scale model-based derivative-free optimization based on iterative minimization within random subspaces. We present a probabilistic worst-case com...
Trust-region methods based on radial basis functions with application to biomedical imaging
Rodrigue Oeuvray · 2005 · Infoscience (Ecole Polytechnique Fédérale de Lausanne) · 34 citations
We have developed a new derivative-free algorithm based on Radial Basis Functions (RBFs). Derivative-free optimization is an active field of research and several algorithms have been proposed recen...
Tuning BARON using derivative-free optimization algorithms
Jianfeng Liu, Nikolaos Ploskas, Nikolaos V. Sahinidis · 2018 · Journal of Global Optimization · 31 citations
Review and comparison of algorithms and software for mixed-integer derivative-free optimization
Nikolaos Ploskas, Nikolaos V. Sahinidis · 2021 · Journal of Global Optimization · 30 citations
Abstract This paper reviews the literature on algorithms for solving bound-constrained mixed-integer derivative-free optimization problems and presents a systematic comparison of available implemen...
Chapter 37: Methodologies and Software for Derivative-Free Optimization
A. L. Custódio, Katya Scheinberg, L. N. Vicente · 2017 · Society for Industrial and Applied Mathematics eBooks · 27 citations
Derivative-free optimization (DFO) methods [502] are typically considered for the minimization/maximization of functions for which the corresponding derivatives neither are available for use nor ca...
Reading Guide
Foundational Papers
Start with Rios and Sahinidis (2012; 1229 citations) for its review of algorithms and software, then Oeuvray and Bierlaire (2009; 41 citations) for the RBF trust-region details essential to understanding black-box minimization.
Recent Advances
Study Cartis and Roberts (2022; 38 citations) for subspace scalability and Ploskas and Sahinidis (2021; 30 citations) for mixed-integer advances building on the foundational reviews.
Core Methods
Trust-region interpolation (Rios and Sahinidis, 2012), radial basis function surrogates (Oeuvray, 2005), and random-subspace least-squares (Cartis and Roberts, 2022).
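The surrogate at the heart of RBF trust-region methods is an interpolant built from radial kernels centered at the sample points. The bare-bones sketch below fits a multiquadric RBF through a handful of evaluations; production implementations (such as those in the papers above) add a polynomial tail and regularization, which this illustration omits.

```python
import numpy as np

def fit_rbf(X, fvals, c=1.0):
    """Fit an interpolating multiquadric RBF surrogate
    m(x) = sum_j w_j * sqrt(||x - x_j||^2 + c^2)
    through sample points X (n_pts x dim) with values fvals."""
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = np.sqrt(dists ** 2 + c ** 2)          # multiquadric kernel matrix
    w = np.linalg.solve(Phi, fvals)             # interpolation weights
    return lambda x: float(
        w @ np.sqrt(np.linalg.norm(X - x, axis=1) ** 2 + c ** 2)
    )

# interpolate a 1-D quadratic from five evaluations
X = np.array([[0.0], [0.5], [1.0], [1.5], [2.0]])
fvals = (X[:, 0] - 1.0) ** 2
model = fit_rbf(X, fvals)
```

Inside a trust-region loop, the cheap surrogate `model` is minimized within the current region and the true objective is evaluated only at the resulting candidate, which is what makes the approach viable for expensive simulations.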
How PapersFlow Helps You Research Derivative-Free Optimization
Discover & Search
Research Agent uses searchPapers and citationGraph on 'derivative-free optimization' to map Rios and Sahinidis (2012; 1229 citations) as the central hub, then findSimilarPapers reveals RBF methods like Oeuvray and Bierlaire (2009). exaSearch uncovers niche applications in biomedical imaging from Oeuvray (2005).
Analyze & Verify
Analysis Agent applies readPaperContent to extract trust-region details from Cartis and Roberts (2022), then verifyResponse with CoVe checks complexity claims against Rios and Sahinidis (2012). runPythonAnalysis reimplements subspace sampling in NumPy sandbox; GRADE assigns A for empirical software comparisons in Liu et al. (2018).
Synthesize & Write
Synthesis Agent detects gaps in high-dimensional mixed-integer DFO via contradiction flagging between Ploskas and Sahinidis (2021) and model-based reviews. Writing Agent uses latexEditText for algorithm pseudocode, latexSyncCitations for the Rios et al. bibliography, and latexCompile for a camera-ready survey; exportMermaid diagrams trust-region RBF flows.
Use Cases
"Reproduce BOOSTERS RBF algorithm performance on noisy functions"
Research Agent → searchPapers('Oeuvray Bierlaire 2009') → Analysis Agent → readPaperContent + runPythonAnalysis (NumPy RBF interpolation on a test suite) → matplotlib convergence plots to check the paper's reported behavior.
"Write LaTeX review comparing DFO software for engineering design"
Research Agent → citationGraph('Rios Sahinidis 2012') → Synthesis → gap detection → Writing Agent → latexEditText (survey draft) → latexSyncCitations (30+ refs) → latexCompile → PDF with trust-region diagrams.
"Find open-source code for scalable subspace DFO methods"
Research Agent → searchPapers('Cartis Roberts 2022') → Code Discovery → paperExtractUrls → paperFindGithubRepo → githubRepoInspect → Python snippets for random subspace minimization (38 citations validated).
Automated Workflows
Deep Research workflow scans 50+ DFO papers via searchPapers → citationGraph clustering around Rios and Sahinidis (2012) → structured report with RBF vs. interpolation taxonomy. DeepScan's 7-step chain verifies Oeuvray (2005) biomedical claims: readPaperContent → CoVe → runPythonAnalysis on imaging data. Theorizer generates hypotheses for noisy mixed-integer DFO from gaps in Ploskas and Sahinidis (2021).
Frequently Asked Questions
What defines Derivative-Free Optimization?
DFO minimizes black-box functions using only objective values, via model-based (interpolation, RBF) or direct-search methods without gradients (Rios and Sahinidis, 2012).
What are core DFO methods?
Trust-region with radial basis functions (Oeuvray and Bierlaire, 2009), subspace minimization (Cartis and Roberts, 2022), and software like NOMAD or Py-BOBYQA (Custódio et al., 2017).
What are key DFO papers?
Foundational: Rios and Sahinidis (2012; 1229 citations) reviews algorithms and software. Recent: Cartis and Roberts (2022; 38 citations) on scalable least-squares; Ploskas and Sahinidis (2021; 30 citations) on mixed-integer problems.
What open problems exist in DFO?
Scalability beyond roughly 20 dimensions, robust handling of noisy evaluations, and efficient global methods for mixed-integer problems all remain open; worst-case guarantees are still limited (Cartis and Roberts, 2022; Ploskas and Sahinidis, 2021).
Research Advanced Optimization Algorithms with AI
PapersFlow provides specialized AI tools for Mathematics researchers. Here are the most relevant for this topic:
AI Literature Review
Automate paper discovery and synthesis across 474M+ papers
Paper Summarizer
Get structured summaries of any paper in seconds
AI Academic Writing
Write research papers with AI assistance and LaTeX support
See how researchers in Physics & Mathematics use PapersFlow
Field-specific workflows, example queries, and use cases.
Start Researching Derivative-Free Optimization with AI
Search 474M+ papers, run AI-powered literature reviews, and write with integrated citations — all in one workspace.
See how PapersFlow works for Mathematics researchers