Improving Choice Testing and Reviews

Edit: although this is an old topic, new readers might find it interesting to read from the beginning. Others can jump to February 2023 by clicking here.

I found a nice-looking Westinghouse stick blender at Costco that was not included in the tests. It would have been an impulse buy, but still I like to check. I understand the reasons products are not tested, so no worries. I then looked at the reviews, with the Electrolux TurboPro Stick Mixer ESTM6400 atop the list.

For perspective on the point I hope to make: Jaguars (esp circa 1960~80) in the US were likened to fine thoroughbred racehorses that broke a leg every time out. Brilliant, until that leg broke. Over the years I have purchased a number of items recommended by Choice, and previously the US Consumer Reports, that worked well but turned out to be very poor products because of reliability or design issues that only became apparent with use.

In the case of the Electrolux TurboPro, 0 of 5 Choice members would recommend it as of this date, and its page has posts noting common problems, with almost 50% rating it bad or terrible even though it performed very well, for a while…

To Electrolux's credit they reached out to their unhappy customers, but it is not obvious whether there was a good resolution. Without hands-on experience with the product my comment is anecdotal, but a poorly designed product is poorly designed, and no amount of replacement or instructions or reaching out to customers will fix it.

Combining the data points of 0 of 5 Choice members recommending it and the posts suggests Choice might improve its testing and reporting if some online research were done in addition to hands-on work, and when particular problems are pointed out time and time again, the testing regimen might investigate and address that in the report.

Choice member reviews:

and from productreview


Thanks @PhilT, we appreciate the point and it’s something we hope to continue improving on in the future. I’ll be sure to pass on your feedback to my colleagues.


We had the same problem when we bought Choice’s top rated microwave (at the time). It was a Sharp that cooked really well, but we had loads of trouble with the oven filling up with water constantly. We had to keep pulling out the turntable to mop up the water in the bottom. Also every surface and around the door was always soaked. The reviews for this microwave were generally bad and after a few months it was covered in rust. We ended up trading it in for a Samsung which is quite good and hasn’t even caught fire.


Due to the sheer number of items that we test, long-term issues are something that we don't (often) get a chance to assess. Unlike, say, the NRMA/Motorcycle/Car magazines, we really don't have a set of long-term test items to give reliability information beyond the couple of weeks they're in the labs.

In the New Things group, that's something we're rather aware of. We've got several longer-term plans in play to help address this. A small sliver you may have seen is that this forum recently introduced a new Review post type. Currently these are not linked into the website, but that's on our medium-term radar (about 2-3 months from now). Additionally, we're linking up our search engine to a collection of data feeds that will allow us to establish long-term trends and integrate information from various sources, online and offline. We've got some more tentative plans, but we'd certainly be happy to take suggestions on how to collect this sort of data.


An easy one that won't always work, due to the sometimes short time a product has been on the market prior to Choice testing: check productreview, Amazon, manufacturers' pages, Reevoo, etc. These reflect unscientific personal opinion and experience, but if there are consistent hits, have a closer look at that aspect of the product and reference the area of consistent complaint as one of the many criteria a person should use rather than just "lab performance". You could add "no problem in our testing scope" if that made the lawyers happier.


Hi there, it's a fine idea to assess the reviews, though they come in very slowly on the CHOICE site, and astroturfing (the term for fake reviews) occurs outside of the CHOICE site.

I have heard it bandied about that around 30% of reviews are fake. We don't have oversight of them, so cannot trust them.

We rely on CHOICE member feedback on these items more than external agencies. You’re much more reliable.

But as stated, reviews come in slowly. We’ve removed recommendations from products previously because of the sheer volume of member feedback, though it doesn’t happen often (last time was a F&P top loading washer).

As an aside, it's interesting that Costco are doing Westinghouse brands.


The suggestion was not a binary switch; it was a recommendation to have a further look when there are many similar problems. If you trust Choice members, the 0 of 5 recommendations on the Electrolux and the concurrent comments on productreview seem to be a genuine red flag.

I would not read anything in particular into Costco selling Westinghouse, since many of their lines vary regularly. For our market I suspect they might get a container of this or that and sell it until gone. In my experience they religiously stand behind everything they sell. In the US they seem to be more consistent, as well as offering greater ranges.


That is correct. Costco often acquire left over crates of end of line/model for many products - both electronics and fashion. They go on sale and when they’re gone, they’re gone. You do get some fabulous bargains, but it won’t be for brand new models of things.

(FWIW I lived in the USA for 10 years and had Costco memberships the entire time there. The business model here appears to be identical)


I agree with the BBG letter. I was a member of Choice for many years and purchased recommended products, only to find problems with reliability and longevity. Also, Choice were quite selective about products, which did not represent all in the particular genre. One example of selective testing was with rotary mowers: only four-stroke models were tested, due to pollution problems with two-strokes (according to Choice). Two-stroke products are still widely sold, due to their simplicity, lighter weight and greater power for a comparable engine capacity. As I was in the market at the time for a mower, I had no option but to make my purchase without any input from Choice. My old mower, a Victa, was some 24 years old and had seen some very heavy work, eg clearing rough blocks, so I decided to update to one of their later models, which so far has proven quite satisfactory.


As long as they are not religiously standing behind things in the same way as have some Roman Catholic priests.

(Too soon?)


Yes, 3 years later I resurrect the topic.

In the market for a new dryer I first looked at the Choice reviews. The top models seem to be superseded items so have been on the market for a while. Both test well and are Choice Recommended. Focusing on the second, the member reviews are not so favourable and reviews on the current retailer websites that feature the product are weighted toward the poor side of the spectrum.

Therein lies a problem with Choice reviews: products that test well are not always good in consumers' hands. Two suggestions:

  1. for each member review, require the reviewer to tick 'recommended' or 'not recommended' and to verify the review is for the tested product in order to make a submission.

In the product test review and comparison table add a line

X Choice Members recommend / X Choice Members do not recommend

  2. the member reviews are generally hidden at a lower level and seem to be all but invisible except via the 'see what our members…' links in the review tables. Why aren't they as visible (eg via Google, or at a higher level as on, say, productreview or retailer review links)? They should be, even if locked behind the member content/subscription wall.

As for the dryer: expert rating 84%, Choice recommended, yet only 2 of 6 members who reviewed it recommend it! productreview shows much the same, and to be fair, the Google aggregate of reviews is weighted from a retailer who no longer sells the model.

The Choice member, productreview, and appliancesonline reviews are consistent. Google's aggregate has many reviews from a retailer site that no longer sells the product, and the lack of detail in those reviews makes them questionably relevant, though I do not suspect they are anything but real reviews of 'a dryer'.

Members who buy a Choice recommended product and expect a good experience but get the reverse have to be bad publicity at best, and I suggest it is evidence reviews could, and need to, include more 'customer experience' content even though it cannot be verified or quantified in a score. The cited case was enough for me to look to another product, but the newer product is yet to be tested and reviewed, so at the end of the day I might punt, or wait until the newer product is reviewed and published before it too is superseded.


Hi @PhilT, I can't agree more with your sentiments. I recall writing to Choice, eons ago pre this forum, regarding their testing of coffee machines. I can't recall the details, but one issue was how they tested the coffee temperature, which I felt was less than adequate.

So now I use the Choice reviews as follows:

  1. For big ticket items I use the Choice testing as just one criterion. I'll look at other web reviews, with the caveat that the seller may only post the good reviews. More importantly, we go down the street and look at and assess all the alternatives.

  2. For low price items that I want quickly I’ll just run with the Choice recommendation

  3. For small kitchen appliances, e.g. kettles, toasters and the like, I check if they are available with my VISA rewards points and, if so, order the item from there.


There is often a disparity between Choice recommendation and user reviews, especially for smaller items such as food processors.

  1. Large electrical products are tested for durability, whilst small ones are not. Often user reviews refer to faults occurring quite quickly and repeatedly. For the testing to be seriously helpful we need durability testing results.
  2. The narration that eg “1 of the Choice Community would recommend product” would be more helpful if it said “1 out of 4 respondents would recommend …”

Cost is likely to be an issue. There are often many models available in small appliance categories and durability testing is time consuming. For example there were 48 electric kettles tested. What would it cost to have a wall full of kettles going for long enough to get sensible data?

Welcome to the community @krome,

You can see from this topic that your post is not the first time these issues have been raised. I welcome all the reinforcement possible.

I don't think Choice formally tests for durability except when it can be simulated, such as mattress compressions or luggage abuse; reliability data for large appliances generally comes from member questionnaires on reliability, warranty claims, and so on.

The disparity between Choice test recommendations and member/user recommendations arises because Choice does quantitative laboratory tests of performance, while member/user reviews reflect longer-term use of the product in a typical consumer's hands; while they can point to good and bad through trends or weightings, they are nevertheless anecdotal.

If long-term reliability testing were done in the lab, publishing dates would be so far out as to be irrelevant considering purchasing timelines and product cycles, so I have empathy for how that part has been pragmatically handled. There is also insufficient lab space to 'park' products without delaying other tests (ie a shortage of lab real estate).

What we ‘violently agree on’ is that Choice needs to reconcile the disparities better than they have been. Choice is about product tests and the comparatively recent focus on consumer advocacy and related issues has not supplanted that. I hope they reflect and keep up and improve their good work, not just broaden their work and rest on business as usual for the lab work without better integrating user experiences.

Note @BrendanMays' reply from July 2017. My calendar shows May 2020.


Hi @krome, welcome to the community

There could be a number of factors why not to focus on user or reliability feedback on every product:

  • Number of items sold: It is worth noting that lower-cost smaller items possibly sell in greater numbers than larger, more expensive items. As a result, one can't really go off the number of consumer reviews posted online (either positive or negative) unless one knows the total number of products sold in Australia. Take, for example, a high-end fridge which sells, say, 10 units in Australia compared to a toaster which sells 10,000. If two consumers posted negative feedback about the high-end fridge and 100 posted negative feedback about the toaster, which product is more reliable? If one doesn't know the number of products sold, one might think that the toaster is less reliable than the fridge. In reality it is the other way around, as 20% of fridge buyers posted negative feedback while only 1% of toaster buyers did.
  • Product durability: Another consideration is that feedback on newly released products may be positive initially, with negative feedback trending up in the longer term as the long-term durability of the product becomes known. One may make a purchase in the initial review phase only to find out later, like most other users, that the long-term durability is poor.
  • Does every user post negative or positive feedback: Quite often individuals will only post negative feedback… or, if they plan to post positive feedback and every other post is negative, this may put them off posting, as they may question their own thoughts on the product. Consumers are more likely to hop online and complain about a product which had a fault or didn't work to their expectations. This biases online consumer/user reviews towards the negative and may not necessarily portray the real situation in relation to the product's reliability or quality. A high percentage of positive reviews may also not adequately reflect the product: if the sample number is small, they could be false/fabricated reviews (some companies have been caught out writing their own reviews).
  • Service life history: Choice carries out regular product reliability surveys, and these are usually limited to more expensive appliance purchases which are expected to have a long service life. To gather information on each and every appliance, device or product sold would be an onerous and very expensive exercise. These surveys also focus on the major brands, with an option for other brands to be included (often seen in survey results as an 'other' category). Again, if Choice surveyed every brand, it would be onerous and costly, and potentially of limited benefit, as most consumers tend to purchase the known, recognised appliance brands. For appliances which have a long service life, Choice can also gauge the service support offered and the problems which occurred within and outside the warranty period. For smaller, cheaper products, defects are usually not known (they are thrown away, or returned for refund/replacement under warranty) and one would be guessing what the problem was.
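The fridge/toaster arithmetic in the first point can be sketched in a few lines of Python. The figures are the hypothetical ones from the example above, not real sales data:

```python
def complaint_rate(negative_reviews, units_sold):
    """Share of buyers who posted negative feedback."""
    return negative_reviews / units_sold

# Hypothetical figures from the fridge/toaster example.
fridge = complaint_rate(2, 10)          # 2 complaints from 10 units sold -> 20%
toaster = complaint_rate(100, 10_000)   # 100 complaints from 10,000 sold -> 1%

# Raw counts suggest the toaster is worse (100 complaints vs 2),
# but normalising by units sold reverses the conclusion.
assert fridge > toaster
```

The point of the sketch is simply that review counts only become comparable once divided by units sold, a denominator reviewers and review sites rarely know.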

There are websites like Product Review, but one must remember that these tend to be biased towards negative reviews and also represent a small sample of the product's users. While such websites are useful for anecdotal information, one has to be careful about relying solely on it. As outlined above, there are many biases, and false impressions could easily be gained from such websites. If Choice did start to look at user feedback on small products, it would likely experience the same or similar problems in relation to whether the feedback received truly represents the product in question.

In a perfect world, I would love to have such information available…but in reality…it is unlikely to ever be practicable.


Thanks for the feedback all. When we as consumers are making product purchasing decisions, there is a variety of sources we can utilise to inform our decisions. Research indicates that word of mouth remains one of the most influential factors, and we all also typically use a combination of user reviews and expert reviews. In my opinion, it is worthwhile using a variety of sources. Probably the biggest strength of a user review is chancing upon a reviewer in a similar use case, who can offer some personally relevant insight (if you can trust it is a genuine perspective).

There is a good reason for the potential disparity between user reviews and expert reviews, and respectfully, I’m not sure that there is a need to reconcile these differences directly from CHOICE’s perspective, although we do intend to prioritise what members have to say and to continue improving and developing avenues for people to provide input. However, we can say with certainty that the system of mass consumer reviews alone also has some significant downfalls we should take care with. For example, we as consumers are particularly swayed by price, branding and marketing. It is often the case that premium brands are perceived to perform better or be more durable, but our testing often reveals this perception is not based on facts.

A regular consumer rarely has the opportunity to thoroughly test a range of products in a controlled environment. There is often an emotional element to consumer reviews; it can be connected to other unrelated events occurring at the time, or to personal trust/value judgments made during the recommendation or purchase process. Was the review item in question a gift from family or a dear friend? Were there price or salesperson influences at play? These things will influence the perception of the average genuine reviewer. I don't think I need to cover the issue of fake or purchased reviews; it's estimated that 30-40% of reviews on a given site are potentially fake or influenced.

There is a host of research out there, disconnected from us, that confirms the message I'm relaying; take this study from the University of Colorado, for example.

Addressing the durability point, there are some ongoing lab tests in place at CHOICE. We mainly undertake mass surveying of consumer experience, the same people over time, to address durability and quality issues in a real world setting - it can change our recommendations or alter our tests. Our lab experts also have a deep understanding of manufacturing quality, as we have the opportunity to undertake types of manufacturing testing or access manufacturers from an independent position that isn’t accessible to most reviewers, professional or otherwise.

That aside, I'm not convinced that most user recommendations address the issue of manufacturing quality, long-term usability or durability anyway. Most reviews are prompted to be made soon after purchase, for a start. In mass markets, there will always be some issues experienced, for whatever reason, by a small percentage of people that doesn't represent the whole. Sometimes this will be a manufacturing fault or issue that has drawn several hundred people to remark, when the real majority is in the hundreds of thousands. Other issues, such as how the product is used or factors outside anyone's control (power surges or brownouts, for example), can skew user review information.

The problems raised above can occur on a large scale, which can create a false trend even across different platforms. The impact of branding means that you can have a poor product and an excellent product within the same category, even the same product range. The indicators users provide are unlikely to be precise enough to identify these issues, except perhaps in the most acute cases (where explosions or safety issues are occurring).

As mentioned, for these reasons, I can't see that CHOICE would seek to reassess our recommendations based on differences between our results and mass user reviews. However, that doesn't mean that we are not evolving; we are listening to what people are saying on review and social platforms so that we can improve our testing and reviews. Here are some examples of things being continually improved or currently prioritised for improvement:

  • The way we display user reviews on our site is currently being addressed
  • The testing process in general can be tweaked as new needs or information is provided, this is balanced with the need to provide consistency and clarity in our advice
  • The way we explain the difference between a popular trend and an expert opinion can be brought forward or explored in more depth
  • We will continue to undertake extensive and individual-level communications on real world experience to develop statistically significant and sound trustworthy advice
  • We also seek to broaden our discussions with as many Australians as possible on a range of different platforms, and to enable different information sources alongside expert opinion
  • We accept that we can’t resolve every personal need at an individual level on a mass scale, but we are personalising tools and reviews as much as possible. One example is with our health insurance review, which has gone from a static recommendation to an interactive one

Please keep adding your thoughts, we will consider them all carefully and make them available to different departments in the organisation.


Am I the only person who read the very detailed recent food processor review with a sense of disbelief? EVERY single Choice-recommended food processor receives what can only be described as scathing usage reviews from Choice members. On the face of it, you'd be a raving lunatic to buy a single one of the food processors if you followed the recommendations of Choice. Apparently not a single one of the multitude of faults manifested by ANY of the units was detected by the Choice testing process: jammed blades, cracked handles, leaky tops, wandering feet, under-powered motors… the list is apparently endless.
As someone who is keen to buy a semi-decent food processor to replace our antique (25 yo), but very faithful, Moulinex - the ‘on’ button finally crumbled apart from age and over-use, but the unit itself continues to function undeterred - I am astonished that apparently not a single manufacturer is capable of producing a unit that exhibits even minimal functionality for longer than roughly twelve months, according to Choice users. Some units fail basic functionality (chopping nuts, anyone?) on FIRST use. And price is utterly no guide at all.
What exactly was Choice thinking of? Rather than going to the bother of publishing this utterly useless and appalling review, I would have hoped that Choice would have publicly upbraided the raft of manufacturers for peddling plastic toys instead of the sturdy kitchen appliances that we should expect of any food processor.
I am very disappointed in Choice.


Hi @Ozcelt, I can see that you’re very frustrated, my apologies that our review hasn’t been able to assist you. I’m impressed that your existing food processor has served you well for 25 years, our testing indicates that a modern high end food processor can last over 20 years with maintenance and repairs (7 years without).

Our food processor reviews are member content, so for this reason I’ll avoid sharing our recommended models. You’ve mentioned:

I’m assuming that you’re referring to the member reviews on the product pages, but please let me know if this isn’t the case. We’ve allowed member reviews on these pages, but many of the products do not have a large number of reviews, or any member reviews at all. We realise this is an area where we could improve. At the same time, I’m not sure that it is accurate to say that every one of these reviews is scathing. On the top product, ‘1 in 6’ reviewers recommend it; on the second product, 5 in 9 reviewers recommend it:


If you go through each of the recommended products, you’ll see examples where all reviewers recommend the product

For your information, we do incorporate member feedback into our product reviews, but not necessarily the public comments that have caused you concern. While it is hard to perform perfect long term, mass market assessments, we do have an extensive member survey that has been running over a number of years to help us gauge real world performance over time using a large number of cases. This is combined with the lab testing we perform to provide the scores members see on the main page of the food processor review.

The CHOICE Community is another place you can ask for user recommendations, I’d welcome you to do so in this thread. I hope this information helps temper your feeling toward our review, and thanks again for this feedback.


Darn Brendan, you had to be so calm and polite in your response…my righteous anger has been entirely dissipated! Now, I just feel a bit…silly?
Anyway, you’re correct of course, there are positive reviews, but I was honestly shocked by how many customer reviews of the recommended units were so negative, mostly (or at least many) suggesting that the build quality of modern food processors is, at best, appalling. I am keen to buy a processor, but I couldn’t find a single one in the review that did not make me feel at least uneasy. That’s a real shame, because the automatic response in such circumstances is simply to purchase the cheapest, nastiest unit that will last a year and can then be replaced by another cheap and nasty unit every year, saving a heap of money lost on purchasing a supposed “quality” unit that cracks, dies or underperforms no less frequently. I guess that’s the manufacturers’ loss. They seem not to care, though. So, a pity.