Structuring Experiments to Get Answers

A proof of concept is an experiment, and experiments produce useful results only when they are structured to answer a specific question. Plugging components together and seeing what happens can be fun, but “it seems to work” is not an answer a system can be designed around. A little structure turns a POC from tinkering into engineering.

Define the Question First

Every POC should start with a single, specific question:

  • “Can this accelerometer detect vibrations below 10 Hz with at least 10 mg resolution?”
  • “Does this buck converter maintain regulation with a 2 A load step?”
  • “Can this LoRa link achieve reliable communication at 500 meters through two interior walls?”
  • “Does this op-amp topology maintain less than 1% THD at 20 kHz?”

Vague questions produce vague answers. “Does this sensor work?” leads to a breadboard that sort of works under ideal conditions, which reveals nothing about whether it will work in the actual application. Specificity forces a definition of what “works” means before building begins.

Define Success and Failure Criteria

Before applying power, write down the expected results and what would constitute a pass or fail:

| Parameter          | Pass                | Fail             | Method                                       |
|--------------------|---------------------|------------------|----------------------------------------------|
| Sensor noise floor | < 5 mg RMS          | > 10 mg RMS      | Log 1000 samples, compute RMS                |
| Regulator ripple   | < 50 mV pk-pk       | > 100 mV pk-pk   | Scope with AC coupling, tip-and-barrel probe |
| Link range         | > 500 m at -120 dBm | < 300 m          | Walk test with RSSI logging                  |
| THD                | < 1% at 1 kHz       | > 2% at 1 kHz    | Audio analyzer or FFT                        |

This table is the experiment design. It specifies what to measure, how to measure it, and what the result means. Write it before building anything.

The pass/fail criteria come from the requirements. If the requirements say “detect vibration below 10 Hz,” the POC must test at frequencies below 10 Hz with a known reference vibration. Testing at 100 Hz and assuming the sensor will also work at 10 Hz is not a valid POC.
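The pass/fail table can also be committed to code before the first test session, which makes post-hoc rationalization harder. A minimal Python sketch, with names and thresholds taken from the example table (only the "lower is better" parameters are shown, and the MARGINAL band between the pass and fail thresholds is an assumption of this sketch, not something the table defines):

```python
# Sketch: encode pass/fail criteria as data BEFORE measuring anything.
# Thresholds mirror the example table; the MARGINAL band is an assumption.

from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    unit: str
    pass_below: float   # measured value below this is a clear pass
    fail_above: float   # measured value above this is a clear fail

    def judge(self, measured: float) -> str:
        if measured < self.pass_below:
            return "PASS"
        if measured > self.fail_above:
            return "FAIL"
        return "MARGINAL"  # between thresholds: needs more data

CRITERIA = [
    Criterion("sensor_noise_floor", "mg RMS", pass_below=5.0, fail_above=10.0),
    Criterion("regulator_ripple", "mV pk-pk", pass_below=50.0, fail_above=100.0),
]

# After the test session, feed in the measured values (illustrative numbers):
results = {"sensor_noise_floor": 3.8, "regulator_ripple": 72.0}
for c in CRITERIA:
    print(f"{c.name}: {results[c.name]} {c.unit} -> {c.judge(results[c.name])}")
```

Writing the thresholds down as data, in version control, timestamps the experiment design and leaves no room to quietly loosen a limit after seeing the results.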

Control the Variables

A meaningful experiment changes one thing at a time and holds everything else constant:

  • Test the sensor, not the power supply. Use a clean, well-regulated bench supply so that power supply noise does not contaminate sensor measurements. If the sensor fails, it is important to know it was the sensor, not the power.
  • Test the communication link, not the antenna. Use the module’s stock antenna (or a known reference antenna) for range testing. If a custom antenna is also being evaluated, that is a separate experiment.
  • Test the circuit, not the firmware. For analog POCs, use the simplest possible firmware (or no firmware at all — a signal generator and a scope). Firmware bugs create confusing results when the goal is to evaluate hardware.

When multiple things must change simultaneously (because they are coupled), acknowledge it. “This test evaluates the sensor and the ADC together because they share a reference voltage” is honest and useful. “This test evaluates the sensor” when it is actually measuring the sensor + ADC + firmware + power supply together is misleading.

Capture the Data

The most common POC failure mode is not a broken circuit — it is lost data. A week later, when the architecture discussion needs to reference the POC results, the breadboard is disassembled, the scope screenshots are lost, and the only record is a vague memory that “it seemed to work.”

Minimum documentation for every POC:

  • Photo of the setup — what is connected to what, which dev board, which breakout, how it is wired.
  • Schematic as built — not the ideal schematic, but what is actually on the breadboard, including any modifications made during testing.
  • Raw data — scope screenshots, data logs, serial output captures. Raw data can be reanalyzed later; impressions cannot.
  • Results summary — one paragraph stating what was tested, what was measured, and whether it passed. Include the numbers.
  • Surprises and observations — anything unexpected. The sensor worked but had a 50 ms startup delay. The regulator met the ripple spec but ran hotter than expected. These observations feed into architecture decisions.

A lab notebook (physical or digital) is the traditional tool. A markdown file in the project repository works just as well. The format does not matter — the act of recording does.
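Raw data is also what makes reanalysis possible. As a sketch of the "log 1000 samples, compute RMS" method from the pass/fail table, assuming accelerometer readings have already been captured to a text log with one value (in mg) per line (the file name is illustrative):

```python
# Sketch of "log 1000 samples, compute RMS" from the pass/fail table.
# Assumes a plain text log, one accelerometer reading in mg per line.

import math

def rms_noise(samples: list[float]) -> float:
    """RMS deviation from the mean: the AC noise floor of a stationary sensor."""
    mean = sum(samples) / len(samples)
    return math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))

def load_log(path: str) -> list[float]:
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

if __name__ == "__main__":
    samples = load_log("accel_still.log")  # illustrative file name
    print(f"n={len(samples)}, noise floor = {rms_noise(samples):.2f} mg RMS")
```

Note that the RMS is taken about the mean, so a DC offset in the sensor does not inflate the noise figure; keeping the raw log means that choice can be revisited later.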

Design for Iteration

A well-structured POC makes the next iteration easy:

  • Use sockets for ICs so components can be swapped without desoldering.
  • Use potentiometers for resistor values that are expected to need tuning (bias points, gain-setting resistors, feedback dividers). Replace them with fixed resistors once the right value is found.
  • Break the circuit into testable stages. Build and verify the power supply before connecting the analog front end. Verify the analog front end before connecting it to the ADC. Stage-by-stage testing isolates problems.
  • Include test points. Even on a breadboard, leaving accessible points for scope probes on critical signals (power rails, clock lines, signal nodes) makes measurement faster.

When the Experiment Does Not Give a Clear Answer

Sometimes the result is ambiguous — the sensor meets the spec at room temperature but temperature extremes are uncertain, or the link works at 400 meters but the requirement is 500 and further testing is not easily arranged.

Options:

  • Add margin to the test. If the requirement is 500 meters, test at 600. If it works at 600, there is confidence in the margin. If it fails at 450, the answer is clear.
  • Test the sensitivity. If the result is marginal, figure out what is limiting performance. Is it noise, signal strength, bandwidth? Understanding the limiting factor reveals whether the design can be improved or whether the component is fundamentally inadequate.
  • Defer the question honestly. “The POC showed the sensor works at room temperature. Performance at -20 degrees C is unverified and should be tested on the prototype PCB with a temperature chamber.” This is a legitimate POC outcome — it answers what it can and clearly states what remains unknown.

The worst outcome is to declare the POC a success when the data is actually inconclusive, then build a system on that shaky foundation.
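One way to firm up a marginal result is to run repeated trials and summarize them, rather than leaning on a single run. A minimal sketch, with illustrative numbers, that summarizes several walk-test trials of a radio link (per-trial mean RSSI plus received/sent packet counts):

```python
# Sketch: summarize repeated link-test trials instead of one anecdotal run.
# All numbers are illustrative; packet_success holds (received, sent) pairs.

import statistics

trials_rssi_dbm = [-118.2, -116.9, -121.4, -119.0, -117.5]  # per-trial mean RSSI
packet_success = [(96, 100), (99, 100), (88, 100), (94, 100), (97, 100)]

rssi_mean = statistics.mean(trials_rssi_dbm)
rssi_sd = statistics.stdev(trials_rssi_dbm)
rates = [rx / tx for rx, tx in packet_success]

print(f"RSSI: {rssi_mean:.1f} dBm (sd {rssi_sd:.1f}) over {len(trials_rssi_dbm)} trials")
print(f"Packet delivery: min {min(rates):.0%}, mean {statistics.mean(rates):.0%}")
```

Reporting the spread alongside the mean (and the worst trial, not just the average) is what distinguishes "works reliably at 500 meters" from "worked once at 500 meters."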

Tips

  • Write the pass/fail table before applying power — it prevents post-hoc rationalization of ambiguous results
  • Change one variable at a time and hold everything else constant; when coupled variables must change together, document that explicitly
  • Capture raw data (scope screenshots, serial logs) during every test session — impressions fade, but data can be reanalyzed
  • Break the circuit into testable stages and verify each one independently before connecting them together

Caveats

  • Confirmation bias is real. After spending three hours building a breadboard, there is a natural desire for it to work — be rigorous about the pass/fail criteria defined before testing, not the criteria that seem appealing after seeing the results
  • One successful trial is not enough. A LoRa link that works once at 500 meters proves it is possible, not reliable — run multiple trials and compute statistics if the measurement allows it
  • Environment matters. A sensor tested on a quiet bench in the lab may fail in the noisy electrical environment of the final application — if the application environment is known, try to approximate it during the POC
  • Temperature is a variable that cannot be ignored. Many components behave differently at temperature extremes — if the application will see -20 degrees C to +60 degrees C, at minimum note that the POC was tested at room temperature only
  • Do not optimize the POC. The goal is to answer a question, not to build the best possible breadboard — if hours are being spent tweaking component values on a breadboard to squeeze out another dB of performance, stop; the question has been answered (“it is marginal”) and the optimization belongs in the schematic design phase