How I test products at BTTR

Every review published on BTTR is written by one person, after hands-on testing, with no commercial influence on the outcome. Here's exactly how I do it.

Nick Broughall



Most review sites will tell you they're independent. Some of them are.

Here's how I know BTTR is: I'm the only person who writes the reviews.

Every single review published on BTTR was written by me, Nick Broughall, after weeks of hands-on testing.

That means you get a consistent standard across every review. You get a consistent voice, and a consistent set of priorities. It's not different people with different opinions working to different briefs.

I've been reviewing consumer technology for 24 years. Before BTTR, I edited T3 magazine and served as the editor at Gizmodo Australia, the managing editor at TechRadar Australia, and the global group publisher at Finder.

I've reviewed everything from flagship smartphones to budget blenders, and regardless of what I review, the approach has always been the same: use the thing properly, work out what's good, work out what isn't, and tell you straight.


By the numbers

  • 24 years of tech journalism experience
  • 246 (and counting) in-depth reviews published on BTTR
  • 1 reviewer (me, Nick)
  • $0 received to change a verdict

What I look for in every review

Before getting into category-specific approaches, there's a framework I apply across everything I test. While some elements have changed over the past few years as I've fine-tuned my approach, the framework has remained largely consistent:

What makes this product stand out? Why should you pick it over the alternatives? Is it bigger? Faster? Cheaper? Or does it have some cool gimmick that sets it apart? It's always worth knowing.

Design and build quality. How does it feel? Is it built to last? Does it look like it's worth what they're asking for it? Does the design have practical advantages or disadvantages worth knowing about?

Performance. How does the product do its job? Does it do what it says on the box, in real conditions, with real use? Where possible, how does it compare with similar products? Can you figure it out without reading the manual cover to cover?

Value for Australians. Prices are always in AUD. Availability is always Australian. A product might be exceptional value at US$90, yet poor value at AU$179, and I'll tell you which it is.


My testing process

Step 1: I get the product. To be honest, as an independent creator, I don’t have the budget to purchase products myself. That means brands or PR agencies generally supply review units. Every review discloses where the product comes from. Either way, commercial relationships don't influence the outcome.

Step 2: I use it in the real world. Not in a lab. At home, in the kitchen, on commutes, during a walk… Wherever it's designed to be used. Testing time varies by product, but I try to test something for at least two weeks before I start writing my review. If I don’t get that long for any reason, it’s disclosed in the review.

Step 3: I benchmark where it counts. For some categories (smartphones, laptops), I run standardised performance tests to give you numbers you can compare across reviews. But for most products, my judgment is the benchmark.

Step 4: I check the specs. After testing, I verify my experience against the manufacturer's claimed specifications. If something doesn't add up, I investigate and say so.

Step 5: I write the verdict. Should you buy the product? That depends on more than my opinion alone, but my verdict answers the question as well as I can, based on my hands-on experience.


How I test by category

Here's how I approach testing in each of the categories I cover at BTTR.

Smartphones and mobile devices

Smartphone reviews run for at least a week of extensive use. I use the phone to make calls, take photos, browse, stream, play games. You know, do everything a phone is supposed to do. I take note of:

  • Battery life with regular daily use
  • Camera quality across multiple lighting conditions: daylight, indoors, low light, and night mode
  • Software experience, including any pre-installed bloat
  • Performance using standardised benchmark tools

Phones are typically tested using the Optus network, though older reviews used Telstra, based on my mobile plan at the time.


Audio (headphones, earphones, speakers, soundbars)

Audio testing is inherently subjective. My preferences lean toward balanced sound (I'm not chasing bass-heavy consumer tuning), and I flag when a product's tuning is clearly tailored to a specific preference.

For headphones and earphones, I listen across a consistent mix of genres chosen to stress different parts of the frequency range. Testing covers:

  • Frequency balance across lows, mids, and highs
  • Noise cancellation performance in real environments, not quiet rooms
  • Microphone quality for calls and voice recording
  • Battery life from full charge
  • Comfort during extended wear (at least 1 to 2 hours of continuous use)

For speakers, I test at a range of volumes in a realistic room environment, not up against a wall in a corner, and not in an acoustically treated studio.

For soundbars, I test with both movies (action scenes with dynamic range requirements) and music, and note whether spatial audio formats like Dolby Atmos actually make a difference.


Cameras and photography

Some camera reviews dive deep into the technical side of photography. Mine are geared towards the general consumer: someone looking to move up from their smartphone to a dedicated device. Each review covers:

  • Image quality across daylight, overcast, indoor artificial light, low light, and night conditions
  • Video quality at the highest supported resolution, including stabilisation performance
  • Autofocus speed and tracking accuracy
  • Battery life across a day of mixed shooting
  • App and ecosystem experience
  • Comfort and design of the camera itself

I shoot predominantly in automatic mode, but do test manual modes where relevant.


Computing (laptops, tablets, desktops)

Computers are tested as my daily driver. That means I complete most of my daily work on the machine, including writing, research, video playback, and anything else the product is positioned for.

I run standardised benchmarks on every device to provide comparable performance data. I also assess:

  • Battery life based on my own real-world experience
  • Keyboard and trackpad quality on laptops
  • Display quality, including brightness and outdoor visibility
  • Thermal management: Does it throttle, does it run hot, does the fan become distracting?
  • Port selection relative to what Australian users typically need

TVs, projectors and displays

Screens are the one category I don't always get to test for a full two weeks, because the logistics of moving a TV in and out of the home get unwieldy. Often, TV companies will set up a room in a hotel or office for testing purposes, and I'll get a limited window (6 to 12 hours) with the product.

In other situations, I get the TV delivered and set up in my loungeroom for extended testing of at least two weeks. I look for:

  • Picture quality across SDR, HDR, and Dolby Vision content where supported
  • Motion handling and refresh rate performance
  • Input lag for gaming, measured separately from cinematic use
  • Smart platform quality and responsiveness
  • Sound quality from built-in speakers
  • Setup and calibration experience out of the box

Smart home and security

Smart home products get an extended testing period (usually 2 to 4 weeks), because reliability over time matters more than first impressions. I assess:

  • Setup experience, including app quality and whether an account is required just to turn a light on
  • Reliability across multiple days of use: does it drop offline, does it miss events?
  • Integration with Google Home, Apple Home, and Amazon Alexa where relevant
  • Privacy practices, including what data is being collected and where it's going
  • Whether any subscription model represents fair ongoing value

Security cameras are tested in my actual home, with footage quality assessed across day, dusk, and night conditions.


Kitchen and home appliances

Appliance testing is hands-on and functional. I use the product for its intended purpose across multiple sessions… I won't review a blender on one smoothie. Testing covers:

  • Performance at the product's core function
  • Build quality and materials
  • Ease of use, including controls and clarity of the interface
  • Ease of cleaning
  • Noise level where relevant
  • Value relative to alternatives at the same price point in Australia

Vacuum cleaners (including robot vacuums)

Vacuum testing runs for at least two weeks of regular use across my home. For stick and upright vacuums, I assess:

  • Suction performance on hard floors and carpet
  • Battery life and whether the runtime matches the claimed figure
  • Noise levels
  • Dustbin capacity and ease of emptying
  • Weight and manoeuvrability, especially for longer cleaning sessions

For robot vacuums, I run the full setup process including mapping, scheduling, and any accompanying app. Testing covers:

  • Navigation and obstacle avoidance in a real home environment
  • Cleaning performance across hard floors and low-pile carpet
  • Dock performance, including auto-empty where applicable
  • App quality and smart home integration
  • How well it handles edge cases: cables, chair legs, uneven transitions between floor types

Fans and air purifiers

For fans and air purifiers, I assess:

  • Core performance at the product's primary function
  • Noise levels across settings — especially important for products used overnight
  • App and smart home connectivity where applicable
  • Energy efficiency where data is available
  • Ease of setup and ongoing maintenance (filter replacement for purifiers, for example)

Health, grooming and personal care

These categories require extended use to assess properly. I use the product as part of my regular routine across multiple weeks and assess:

  • Core performance: does it do what it claims?
  • Build quality and durability over time
  • Ease of use and ergonomics
  • Battery life and charging convenience where applicable

Wearables (smartwatches, fitness trackers, rings)

Wearables are worn continuously during the testing period. I assess:

  • Fitness and health tracking accuracy, compared against other devices worn at the same time where possible
  • Battery life from full charge with typical use
  • Smart features and notification handling
  • App quality and how useful the data actually is
  • Comfort for all-day and overnight wear

A note on editorial independence

BTTR doesn't accept payment to influence a review verdict. Ever.

When a product is supplied for review by a brand or PR agency, I disclose it prominently in the review. My expectation is always that products get returned after reviewing… And if you could see my garage, you'd understand that brands don't always follow through on collecting them.

My point is: I'm not writing positive reviews to get free stuff. I'm reviewing products to tell you whether they're worth buying.

I approach reviewing as a form of constructive criticism, which means I will communicate the weaknesses of a product just as strongly as its strengths.

BTTR earns affiliate commissions on some purchase links. That income helps keep this publication running as an independent operation. It doesn't influence which products I recommend or what I write about them. You can read more about how BTTR makes money.


My approach to verdicts

I don't use numerical scores. I used to, but I've tested enough products to know that a 7.5 tells you much less than a clear explanation of who a product is for and who it isn't for.

Every BTTR review ends with a Buy if / Skip if section. These are the specific conditions under which I'd recommend you buy (or avoid) a product. Hopefully, they answer the actual question you came to my review to answer: should I buy this?

If you've got a question about a product I've reviewed — or one I haven't — you can always get in touch.