
Quantum Annealing: What’s the Use?

  • Writer: David Wood
  • 4 days ago
  • 4 min read

Updated: 3 days ago


Why the real game-changer in advanced computing starts with properly defining the use case. A guest article by Walter Hough, Vice President of Algorithmic Warfare at Davidson


Over the past four years at Davidson, I’ve heard some version of the same opening line again and again:


“We really want to use quantum on this.”


“Annealing is supposed to crush problems like ours, right?”


“We have petabytes of data—can we just throw it at your machine and see what happens?”


I appreciate the enthusiasm. But without a crystal-clear understanding of the decision you’re actually trying to improve, even the most advanced hardware becomes nothing more than a very expensive science project.


And that’s not what anyone wants.


Decision advantage is the whole point


Look at what the Department of Defense's Office of the Under Secretary of Defense for Research and Engineering (OUSD R&E) has identified among its Critical Technology Areas: Contested Logistics, and Quantum and Battlefield Information Dominance (Q-BID).


The common thread is simple: winning comes down to making smarter decisions, faster—especially when conditions are messy and timelines are compressed.


Contested Logistics isn’t really about trucks and ships. It’s about deciding where to send the last few pallets of critical parts when half your communications are jammed and the threat picture keeps shifting.


Q-BID isn’t about raw compute power. It’s about deciding whether to activate a radar, go dark, reposition, or strike—when the information you have is incomplete, delayed, or potentially misleading.


These are mission-driven decision problems. The role of advanced computing isn’t to impress. It’s to improve those decisions—faster, more confidently, and under harsher constraints.


When you start with mission outcomes, advanced optimization becomes an enabler of classic systems engineering—not a distraction from it.


Why starting with the tech screws things up


When conversations begin with “quantum is amazing” or “annealing is the future,” predictable bad things happen:


You start bending the mission to fit the tool.

You quietly ignore real-world constraints: time, bandwidth, power, classification, risk.


You build something that looks great on a slide but changes nothing about how real decisions get made.


You get a demo.

You don’t get decision superiority.


That distinction matters.


Set aside the world of perfect datasets and benchmarks for a second. A real use case has four things going for it:


  • A decision that actually moves the needle. If the answer doesn’t change what a commander, planner, or autopilot does, it’s not ready. 

  • Nasty, real constraints. Time windows, threats popping up, tiny bandwidth, low power budgets, geometry, acceptable risk levels. Complexity! 

  • A clear way to win. Not perfect optimality. Just faster, more confident, or more resilient than whatever we’re doing today. 

  • It fits into how people actually work. The output has to land with operators or planning cells. 


When those pieces are in place, advanced optimization and annealing can actually deliver.

The first step is turning nebulous operational needs into well-defined problem shapes and identifying which constraints actually matter versus which ones are negotiable.
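
To make that translation concrete, here is a minimal sketch in plain Python. It is not Davidson's method, and every number in it is invented: a toy version of the last-pallets decision above, written as a QUBO (quadratic unconstrained binary optimization), the input format that quantum annealers and classical annealing solvers share. The mission objective becomes the linear terms; the hard "exactly k sites" rule becomes a quadratic penalty.

# A minimal sketch, not Davidson's formulation; all numbers are invented.
# Decision: send the last pallet loads to exactly k of n candidate sites,
# minimizing total delivery risk. The "exactly k" constraint is folded into
# the objective as a quadratic penalty P * (sum_i x_i - k)**2.
from itertools import product

risk = [0.7, 0.2, 0.9, 0.4]   # hypothetical risk score per candidate site
n, k = len(risk), 2           # four sites, choose exactly two
P = 10.0                      # penalty weight; must dominate the risk scale

Q = {}                        # QUBO as {(i, j): coefficient}, i <= j
for i in range(n):
    Q[(i, i)] = risk[i] + P * (1 - 2 * k)   # linear part of the penalty expansion
    for j in range(i + 1, n):
        Q[(i, j)] = 2 * P                   # pairwise part of the penalty

def energy(bits):             # QUBO objective for one 0/1 assignment
    return sum(c * bits[i] * bits[j] for (i, j), c in Q.items())

# Brute force is fine at toy size; the point is the formulation, not the solver.
best = min(product((0, 1), repeat=n), key=energy)
print([i for i, b in enumerate(best) if b])  # -> [1, 3], the two lowest-risk sites

Most of the formulation work hides in choices like the penalty weight P: deciding which constraints are hard, which are negotiable, and how heavily to weight them is exactly the problem-shaping step described above.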


Why early partnership matters


This is where many promising ideas fail: too much focus on the tool, not enough on the decision.


Early partnership allows teams to:

  • Translate nebulous operational needs into real problem shapes

  • Identify which constraints truly matter

  • Determine whether classical, hybrid, or annealing approaches even make sense

  • Avoid wasting time on problems already solved well by existing methods
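
On those last two points, here is a hedged sketch of how the check can look in practice, reusing the toy QUBO from the earlier example: before touching quantum hardware, run ordinary simulated annealing on a commodity CPU and see how far it gets.

# A minimal classical baseline, same toy QUBO as above; plain Python only.
# Single-bit-flip Metropolis search with a geometric cooling schedule.
import math, random

def qubo_energy(Q, x):
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

def simulated_annealing(Q, n, sweeps=5000, t_hot=5.0, t_cold=0.05, seed=1):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]      # random start
    e = qubo_energy(Q, x)
    best_x, best_e = x[:], e
    for step in range(sweeps):
        t = t_hot * (t_cold / t_hot) ** (step / (sweeps - 1))  # cool down
        i = rng.randrange(n)
        x[i] ^= 1                                  # propose one bit flip
        e_new = qubo_energy(Q, x)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                              # accept the move
            if e < best_e:
                best_x, best_e = x[:], e
        else:
            x[i] ^= 1                              # reject: undo the flip
    return best_x, best_e

# Rebuild the toy site-selection QUBO (hypothetical numbers, as before):
risk, P, k = [0.7, 0.2, 0.9, 0.4], 10.0, 2
n = len(risk)
Q = {(i, i): risk[i] + P * (1 - 2 * k) for i in range(n)}
Q.update({(i, j): 2 * P for i in range(n) for j in range(i + 1, n)})

x, _ = simulated_annealing(Q, n)
print([i for i in range(n) if x[i]])  # typically [1, 3] at this toy size

If a baseline like this already meets the mission's speed and quality bar, the problem is probably one that existing methods solve well; if it falls over as the variables and constraints multiply, you may have a use case worth carrying forward.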


That grounding keeps expectations realistic, timelines credible, and risk visible.

If you’re wrestling with messy decisions, contested environments, time-critical tradeoffs, or resilience when systems are breaking, that’s where advanced optimization belongs.


Not as a showcase—but as an enabler.


If that sounds like the problem space you're working in, reach out. Access to the right technology matters.


So, what’s the use? That is the question worth answering first.


Davidson’s perspective

 

This is where Davidson Technologies' legacy matters. Our background in systems engineering, integration, and test on strategic and national-level systems shapes how we approach advanced computing. Interested in talking?


Reach out to me with a quick message on LinkedIn or email me directly to learn more about how Davidson is partnering with businesses large and small that have HPC requirements, complex problem sets, or simply an interest in advanced computing.


About the Author

Walter Hough is Vice President of Algorithmic Warfare at Davidson, where he leads mission-driven optimization, advanced computing, and decision-support initiatives focused on real-world operational impact.


With a background rooted in systems engineering, integration, and test on strategic and national-level systems, Walter specializes in translating complex, high-constraint mission needs into actionable computational approaches—spanning classical, hybrid, and quantum-enabled methods. His work centers on ensuring that advanced technologies deliver true decision advantage, not just impressive demonstrations.


This mission-first approach is shaping how leaders across defense and national security are thinking about quantum adoption. Davidson CEO Dale Moore recently explored this shift in a conversation with The Washington Times, discussing how quantum and advanced computing will shape the future of defense.